Oct 02 07:11:54 localhost kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct 02 07:11:54 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct 02 07:11:54 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 02 07:11:54 localhost kernel: BIOS-provided physical RAM map:
Oct 02 07:11:54 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct 02 07:11:54 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct 02 07:11:54 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct 02 07:11:54 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct 02 07:11:54 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct 02 07:11:54 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct 02 07:11:54 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct 02 07:11:54 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct 02 07:11:54 localhost kernel: NX (Execute Disable) protection: active
Oct 02 07:11:54 localhost kernel: APIC: Static calls initialized
Oct 02 07:11:54 localhost kernel: SMBIOS 2.8 present.
Oct 02 07:11:54 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct 02 07:11:54 localhost kernel: Hypervisor detected: KVM
Oct 02 07:11:54 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct 02 07:11:54 localhost kernel: kvm-clock: using sched offset of 4021493441 cycles
Oct 02 07:11:54 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct 02 07:11:54 localhost kernel: tsc: Detected 2799.886 MHz processor
Oct 02 07:11:54 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Oct 02 07:11:54 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Oct 02 07:11:54 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct 02 07:11:54 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct 02 07:11:54 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct 02 07:11:54 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct 02 07:11:54 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct 02 07:11:54 localhost kernel: Using GB pages for direct mapping
Oct 02 07:11:54 localhost kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct 02 07:11:54 localhost kernel: ACPI: Early table checksum verification disabled
Oct 02 07:11:54 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct 02 07:11:54 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 02 07:11:54 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 02 07:11:54 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 02 07:11:54 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct 02 07:11:54 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 02 07:11:54 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 02 07:11:54 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct 02 07:11:54 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct 02 07:11:54 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct 02 07:11:54 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct 02 07:11:54 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct 02 07:11:54 localhost kernel: No NUMA configuration found
Oct 02 07:11:54 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct 02 07:11:54 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct 02 07:11:54 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct 02 07:11:54 localhost kernel: Zone ranges:
Oct 02 07:11:54 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct 02 07:11:54 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct 02 07:11:54 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct 02 07:11:54 localhost kernel:   Device   empty
Oct 02 07:11:54 localhost kernel: Movable zone start for each node
Oct 02 07:11:54 localhost kernel: Early memory node ranges
Oct 02 07:11:54 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct 02 07:11:54 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct 02 07:11:54 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct 02 07:11:54 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct 02 07:11:54 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 02 07:11:54 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct 02 07:11:54 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct 02 07:11:54 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Oct 02 07:11:54 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct 02 07:11:54 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct 02 07:11:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct 02 07:11:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct 02 07:11:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct 02 07:11:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct 02 07:11:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct 02 07:11:54 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 02 07:11:54 localhost kernel: TSC deadline timer available
Oct 02 07:11:54 localhost kernel: CPU topo: Max. logical packages:   8
Oct 02 07:11:54 localhost kernel: CPU topo: Max. logical dies:       8
Oct 02 07:11:54 localhost kernel: CPU topo: Max. dies per package:   1
Oct 02 07:11:54 localhost kernel: CPU topo: Max. threads per core:   1
Oct 02 07:11:54 localhost kernel: CPU topo: Num. cores per package:     1
Oct 02 07:11:54 localhost kernel: CPU topo: Num. threads per package:   1
Oct 02 07:11:54 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct 02 07:11:54 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct 02 07:11:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct 02 07:11:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct 02 07:11:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct 02 07:11:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct 02 07:11:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct 02 07:11:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct 02 07:11:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct 02 07:11:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct 02 07:11:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct 02 07:11:54 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct 02 07:11:54 localhost kernel: Booting paravirtualized kernel on KVM
Oct 02 07:11:54 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 02 07:11:54 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct 02 07:11:54 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct 02 07:11:54 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Oct 02 07:11:54 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Oct 02 07:11:54 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Oct 02 07:11:54 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 02 07:11:54 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct 02 07:11:54 localhost kernel: random: crng init done
Oct 02 07:11:54 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct 02 07:11:54 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct 02 07:11:54 localhost kernel: Fallback order for Node 0: 0 
Oct 02 07:11:54 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct 02 07:11:54 localhost kernel: Policy zone: Normal
Oct 02 07:11:54 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 02 07:11:54 localhost kernel: software IO TLB: area num 8.
Oct 02 07:11:54 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct 02 07:11:54 localhost kernel: ftrace: allocating 49370 entries in 193 pages
Oct 02 07:11:54 localhost kernel: ftrace: allocated 193 pages with 3 groups
Oct 02 07:11:54 localhost kernel: Dynamic Preempt: voluntary
Oct 02 07:11:54 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 02 07:11:54 localhost kernel: rcu:         RCU event tracing is enabled.
Oct 02 07:11:54 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct 02 07:11:54 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Oct 02 07:11:54 localhost kernel:         Rude variant of Tasks RCU enabled.
Oct 02 07:11:54 localhost kernel:         Tracing variant of Tasks RCU enabled.
Oct 02 07:11:54 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 02 07:11:54 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct 02 07:11:54 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 02 07:11:54 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 02 07:11:54 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 02 07:11:54 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct 02 07:11:54 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct 02 07:11:54 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct 02 07:11:54 localhost kernel: Console: colour VGA+ 80x25
Oct 02 07:11:54 localhost kernel: printk: console [ttyS0] enabled
Oct 02 07:11:54 localhost kernel: ACPI: Core revision 20230331
Oct 02 07:11:54 localhost kernel: APIC: Switch to symmetric I/O mode setup
Oct 02 07:11:54 localhost kernel: x2apic enabled
Oct 02 07:11:54 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Oct 02 07:11:54 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct 02 07:11:54 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.77 BogoMIPS (lpj=2799886)
Oct 02 07:11:54 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct 02 07:11:54 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct 02 07:11:54 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct 02 07:11:54 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 02 07:11:54 localhost kernel: Spectre V2 : Mitigation: Retpolines
Oct 02 07:11:54 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct 02 07:11:54 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct 02 07:11:54 localhost kernel: RETBleed: Mitigation: untrained return thunk
Oct 02 07:11:54 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 02 07:11:54 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 02 07:11:54 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct 02 07:11:54 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct 02 07:11:54 localhost kernel: x86/bugs: return thunk changed
Oct 02 07:11:54 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct 02 07:11:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 02 07:11:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 02 07:11:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 02 07:11:54 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct 02 07:11:54 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct 02 07:11:54 localhost kernel: Freeing SMP alternatives memory: 40K
Oct 02 07:11:54 localhost kernel: pid_max: default: 32768 minimum: 301
Oct 02 07:11:54 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct 02 07:11:54 localhost kernel: landlock: Up and running.
Oct 02 07:11:54 localhost kernel: Yama: becoming mindful.
Oct 02 07:11:54 localhost kernel: SELinux:  Initializing.
Oct 02 07:11:54 localhost kernel: LSM support for eBPF active
Oct 02 07:11:54 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 02 07:11:54 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 02 07:11:54 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct 02 07:11:54 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct 02 07:11:54 localhost kernel: ... version:                0
Oct 02 07:11:54 localhost kernel: ... bit width:              48
Oct 02 07:11:54 localhost kernel: ... generic registers:      6
Oct 02 07:11:54 localhost kernel: ... value mask:             0000ffffffffffff
Oct 02 07:11:54 localhost kernel: ... max period:             00007fffffffffff
Oct 02 07:11:54 localhost kernel: ... fixed-purpose events:   0
Oct 02 07:11:54 localhost kernel: ... event mask:             000000000000003f
Oct 02 07:11:54 localhost kernel: signal: max sigframe size: 1776
Oct 02 07:11:54 localhost kernel: rcu: Hierarchical SRCU implementation.
Oct 02 07:11:54 localhost kernel: rcu:         Max phase no-delay instances is 400.
Oct 02 07:11:54 localhost kernel: smp: Bringing up secondary CPUs ...
Oct 02 07:11:54 localhost kernel: smpboot: x86: Booting SMP configuration:
Oct 02 07:11:54 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct 02 07:11:54 localhost kernel: smp: Brought up 1 node, 8 CPUs
Oct 02 07:11:54 localhost kernel: smpboot: Total of 8 processors activated (44798.17 BogoMIPS)
Oct 02 07:11:54 localhost kernel: node 0 deferred pages initialised in 19ms
Oct 02 07:11:54 localhost kernel: Memory: 7765548K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616504K reserved, 0K cma-reserved)
Oct 02 07:11:54 localhost kernel: devtmpfs: initialized
Oct 02 07:11:54 localhost kernel: x86/mm: Memory block size: 128MB
Oct 02 07:11:54 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 02 07:11:54 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct 02 07:11:54 localhost kernel: pinctrl core: initialized pinctrl subsystem
Oct 02 07:11:54 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 02 07:11:54 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct 02 07:11:54 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct 02 07:11:54 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct 02 07:11:54 localhost kernel: audit: initializing netlink subsys (disabled)
Oct 02 07:11:54 localhost kernel: audit: type=2000 audit(1759389112.706:1): state=initialized audit_enabled=0 res=1
Oct 02 07:11:54 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct 02 07:11:54 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 02 07:11:54 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 02 07:11:54 localhost kernel: cpuidle: using governor menu
Oct 02 07:11:54 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 02 07:11:54 localhost kernel: PCI: Using configuration type 1 for base access
Oct 02 07:11:54 localhost kernel: PCI: Using configuration type 1 for extended access
Oct 02 07:11:54 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 02 07:11:54 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct 02 07:11:54 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct 02 07:11:54 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct 02 07:11:54 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct 02 07:11:54 localhost kernel: Demotion targets for Node 0: null
Oct 02 07:11:54 localhost kernel: cryptd: max_cpu_qlen set to 1000
Oct 02 07:11:54 localhost kernel: ACPI: Added _OSI(Module Device)
Oct 02 07:11:54 localhost kernel: ACPI: Added _OSI(Processor Device)
Oct 02 07:11:54 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct 02 07:11:54 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 02 07:11:54 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 02 07:11:54 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct 02 07:11:54 localhost kernel: ACPI: Interpreter enabled
Oct 02 07:11:54 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct 02 07:11:54 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Oct 02 07:11:54 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 02 07:11:54 localhost kernel: PCI: Using E820 reservations for host bridge windows
Oct 02 07:11:54 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct 02 07:11:54 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct 02 07:11:54 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [3] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [4] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [5] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [6] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [7] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [8] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [9] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [10] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [11] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [12] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [13] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [14] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [15] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [16] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [17] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [18] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [19] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [20] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [21] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [22] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [23] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [24] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [25] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [26] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [27] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [28] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [29] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [30] registered
Oct 02 07:11:54 localhost kernel: acpiphp: Slot [31] registered
Oct 02 07:11:54 localhost kernel: PCI host bridge to bus 0000:00
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct 02 07:11:54 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct 02 07:11:54 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct 02 07:11:54 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct 02 07:11:54 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct 02 07:11:54 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct 02 07:11:54 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct 02 07:11:54 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 02 07:11:54 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 02 07:11:54 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct 02 07:11:54 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct 02 07:11:54 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct 02 07:11:54 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct 02 07:11:54 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct 02 07:11:54 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct 02 07:11:54 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct 02 07:11:54 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct 02 07:11:54 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct 02 07:11:54 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct 02 07:11:54 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct 02 07:11:54 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct 02 07:11:54 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct 02 07:11:54 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct 02 07:11:54 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct 02 07:11:54 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct 02 07:11:54 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct 02 07:11:54 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct 02 07:11:54 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct 02 07:11:54 localhost kernel: iommu: Default domain type: Translated
Oct 02 07:11:54 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct 02 07:11:54 localhost kernel: SCSI subsystem initialized
Oct 02 07:11:54 localhost kernel: ACPI: bus type USB registered
Oct 02 07:11:54 localhost kernel: usbcore: registered new interface driver usbfs
Oct 02 07:11:54 localhost kernel: usbcore: registered new interface driver hub
Oct 02 07:11:54 localhost kernel: usbcore: registered new device driver usb
Oct 02 07:11:54 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Oct 02 07:11:54 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct 02 07:11:54 localhost kernel: PTP clock support registered
Oct 02 07:11:54 localhost kernel: EDAC MC: Ver: 3.0.0
Oct 02 07:11:54 localhost kernel: NetLabel: Initializing
Oct 02 07:11:54 localhost kernel: NetLabel:  domain hash size = 128
Oct 02 07:11:54 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct 02 07:11:54 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Oct 02 07:11:54 localhost kernel: PCI: Using ACPI for IRQ routing
Oct 02 07:11:54 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Oct 02 07:11:54 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Oct 02 07:11:54 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Oct 02 07:11:54 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct 02 07:11:54 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct 02 07:11:54 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct 02 07:11:54 localhost kernel: vgaarb: loaded
Oct 02 07:11:54 localhost kernel: clocksource: Switched to clocksource kvm-clock
Oct 02 07:11:54 localhost kernel: VFS: Disk quotas dquot_6.6.0
Oct 02 07:11:54 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct 02 07:11:54 localhost kernel: pnp: PnP ACPI init
Oct 02 07:11:54 localhost kernel: pnp 00:03: [dma 2]
Oct 02 07:11:54 localhost kernel: pnp: PnP ACPI: found 5 devices
Oct 02 07:11:54 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct 02 07:11:54 localhost kernel: NET: Registered PF_INET protocol family
Oct 02 07:11:54 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 02 07:11:54 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct 02 07:11:54 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct 02 07:11:54 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct 02 07:11:54 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct 02 07:11:54 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct 02 07:11:54 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct 02 07:11:54 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 02 07:11:54 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 02 07:11:54 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct 02 07:11:54 localhost kernel: NET: Registered PF_XDP protocol family
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct 02 07:11:54 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct 02 07:11:54 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct 02 07:11:54 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct 02 07:11:54 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 76159 usecs
Oct 02 07:11:54 localhost kernel: PCI: CLS 0 bytes, default 64
Oct 02 07:11:54 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct 02 07:11:54 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct 02 07:11:54 localhost kernel: ACPI: bus type thunderbolt registered
Oct 02 07:11:54 localhost kernel: Trying to unpack rootfs image as initramfs...
Oct 02 07:11:54 localhost kernel: Initialise system trusted keyrings
Oct 02 07:11:54 localhost kernel: Key type blacklist registered
Oct 02 07:11:54 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct 02 07:11:54 localhost kernel: zbud: loaded
Oct 02 07:11:54 localhost kernel: integrity: Platform Keyring initialized
Oct 02 07:11:54 localhost kernel: integrity: Machine keyring initialized
Oct 02 07:11:54 localhost kernel: Freeing initrd memory: 86104K
Oct 02 07:11:54 localhost kernel: NET: Registered PF_ALG protocol family
Oct 02 07:11:54 localhost kernel: xor: automatically using best checksumming function   avx       
Oct 02 07:11:54 localhost kernel: Key type asymmetric registered
Oct 02 07:11:54 localhost kernel: Asymmetric key parser 'x509' registered
Oct 02 07:11:54 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct 02 07:11:54 localhost kernel: io scheduler mq-deadline registered
Oct 02 07:11:54 localhost kernel: io scheduler kyber registered
Oct 02 07:11:54 localhost kernel: io scheduler bfq registered
Oct 02 07:11:54 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct 02 07:11:54 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct 02 07:11:54 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct 02 07:11:54 localhost kernel: ACPI: button: Power Button [PWRF]
Oct 02 07:11:54 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct 02 07:11:54 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct 02 07:11:54 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct 02 07:11:54 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct 02 07:11:54 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct 02 07:11:54 localhost kernel: Non-volatile memory driver v1.3
Oct 02 07:11:54 localhost kernel: rdac: device handler registered
Oct 02 07:11:54 localhost kernel: hp_sw: device handler registered
Oct 02 07:11:54 localhost kernel: emc: device handler registered
Oct 02 07:11:54 localhost kernel: alua: device handler registered
Oct 02 07:11:54 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct 02 07:11:54 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct 02 07:11:54 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct 02 07:11:54 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct 02 07:11:54 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct 02 07:11:54 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct 02 07:11:54 localhost kernel: usb usb1: Product: UHCI Host Controller
Oct 02 07:11:54 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct 02 07:11:54 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct 02 07:11:54 localhost kernel: hub 1-0:1.0: USB hub found
Oct 02 07:11:54 localhost kernel: hub 1-0:1.0: 2 ports detected
Oct 02 07:11:54 localhost kernel: usbcore: registered new interface driver usbserial_generic
Oct 02 07:11:54 localhost kernel: usbserial: USB Serial support registered for generic
Oct 02 07:11:54 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct 02 07:11:54 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct 02 07:11:54 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct 02 07:11:54 localhost kernel: mousedev: PS/2 mouse device common for all mice
Oct 02 07:11:54 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Oct 02 07:11:54 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct 02 07:11:54 localhost kernel: rtc_cmos 00:04: registered as rtc0
Oct 02 07:11:54 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-10-02T07:11:53 UTC (1759389113)
Oct 02 07:11:54 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct 02 07:11:54 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct 02 07:11:54 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Oct 02 07:11:54 localhost kernel: usbcore: registered new interface driver usbhid
Oct 02 07:11:54 localhost kernel: usbhid: USB HID core driver
Oct 02 07:11:54 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct 02 07:11:54 localhost kernel: drop_monitor: Initializing network drop monitor service
Oct 02 07:11:54 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct 02 07:11:54 localhost kernel: Initializing XFRM netlink socket
Oct 02 07:11:54 localhost kernel: NET: Registered PF_INET6 protocol family
Oct 02 07:11:54 localhost kernel: Segment Routing with IPv6
Oct 02 07:11:54 localhost kernel: NET: Registered PF_PACKET protocol family
Oct 02 07:11:54 localhost kernel: mpls_gso: MPLS GSO support
Oct 02 07:11:54 localhost kernel: IPI shorthand broadcast: enabled
Oct 02 07:11:54 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Oct 02 07:11:54 localhost kernel: AES CTR mode by8 optimization enabled
Oct 02 07:11:54 localhost kernel: sched_clock: Marking stable (1208005372, 139262560)->(1462049944, -114782012)
Oct 02 07:11:54 localhost kernel: registered taskstats version 1
Oct 02 07:11:54 localhost kernel: Loading compiled-in X.509 certificates
Oct 02 07:11:54 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct 02 07:11:54 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct 02 07:11:54 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct 02 07:11:54 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct 02 07:11:54 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct 02 07:11:54 localhost kernel: Demotion targets for Node 0: null
Oct 02 07:11:54 localhost kernel: page_owner is disabled
Oct 02 07:11:54 localhost kernel: Key type .fscrypt registered
Oct 02 07:11:54 localhost kernel: Key type fscrypt-provisioning registered
Oct 02 07:11:54 localhost kernel: Key type big_key registered
Oct 02 07:11:54 localhost kernel: Key type encrypted registered
Oct 02 07:11:54 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Oct 02 07:11:54 localhost kernel: Loading compiled-in module X.509 certificates
Oct 02 07:11:54 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct 02 07:11:54 localhost kernel: ima: Allocated hash algorithm: sha256
Oct 02 07:11:54 localhost kernel: ima: No architecture policies found
Oct 02 07:11:54 localhost kernel: evm: Initialising EVM extended attributes:
Oct 02 07:11:54 localhost kernel: evm: security.selinux
Oct 02 07:11:54 localhost kernel: evm: security.SMACK64 (disabled)
Oct 02 07:11:54 localhost kernel: evm: security.SMACK64EXEC (disabled)
Oct 02 07:11:54 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct 02 07:11:54 localhost kernel: evm: security.SMACK64MMAP (disabled)
Oct 02 07:11:54 localhost kernel: evm: security.apparmor (disabled)
Oct 02 07:11:54 localhost kernel: evm: security.ima
Oct 02 07:11:54 localhost kernel: evm: security.capability
Oct 02 07:11:54 localhost kernel: evm: HMAC attrs: 0x1
Oct 02 07:11:54 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct 02 07:11:54 localhost kernel: Running certificate verification RSA selftest
Oct 02 07:11:54 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct 02 07:11:54 localhost kernel: Running certificate verification ECDSA selftest
Oct 02 07:11:54 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct 02 07:11:54 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct 02 07:11:54 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct 02 07:11:54 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Oct 02 07:11:54 localhost kernel: usb 1-1: Manufacturer: QEMU
Oct 02 07:11:54 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct 02 07:11:54 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct 02 07:11:54 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct 02 07:11:54 localhost kernel: clk: Disabling unused clocks
Oct 02 07:11:54 localhost kernel: Freeing unused decrypted memory: 2028K
Oct 02 07:11:54 localhost kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct 02 07:11:54 localhost kernel: Write protecting the kernel read-only data: 30720k
Oct 02 07:11:54 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct 02 07:11:54 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct 02 07:11:54 localhost kernel: Run /init as init process
Oct 02 07:11:54 localhost kernel:   with arguments:
Oct 02 07:11:54 localhost kernel:     /init
Oct 02 07:11:54 localhost kernel:   with environment:
Oct 02 07:11:54 localhost kernel:     HOME=/
Oct 02 07:11:54 localhost kernel:     TERM=linux
Oct 02 07:11:54 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64
Oct 02 07:11:54 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 02 07:11:54 localhost systemd[1]: Detected virtualization kvm.
Oct 02 07:11:54 localhost systemd[1]: Detected architecture x86-64.
Oct 02 07:11:54 localhost systemd[1]: Running in initrd.
Oct 02 07:11:54 localhost systemd[1]: No hostname configured, using default hostname.
Oct 02 07:11:54 localhost systemd[1]: Hostname set to <localhost>.
Oct 02 07:11:54 localhost systemd[1]: Initializing machine ID from VM UUID.
Oct 02 07:11:54 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Oct 02 07:11:54 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 02 07:11:54 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 02 07:11:54 localhost systemd[1]: Reached target Initrd /usr File System.
Oct 02 07:11:54 localhost systemd[1]: Reached target Local File Systems.
Oct 02 07:11:54 localhost systemd[1]: Reached target Path Units.
Oct 02 07:11:54 localhost systemd[1]: Reached target Slice Units.
Oct 02 07:11:54 localhost systemd[1]: Reached target Swaps.
Oct 02 07:11:54 localhost systemd[1]: Reached target Timer Units.
Oct 02 07:11:54 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 02 07:11:54 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Oct 02 07:11:54 localhost systemd[1]: Listening on Journal Socket.
Oct 02 07:11:54 localhost systemd[1]: Listening on udev Control Socket.
Oct 02 07:11:54 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 02 07:11:54 localhost systemd[1]: Reached target Socket Units.
Oct 02 07:11:54 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 02 07:11:54 localhost systemd[1]: Starting Journal Service...
Oct 02 07:11:54 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 02 07:11:54 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 02 07:11:54 localhost systemd[1]: Starting Create System Users...
Oct 02 07:11:54 localhost systemd[1]: Starting Setup Virtual Console...
Oct 02 07:11:54 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 02 07:11:54 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 02 07:11:54 localhost systemd[1]: Finished Create System Users.
Oct 02 07:11:54 localhost systemd-journald[311]: Journal started
Oct 02 07:11:54 localhost systemd-journald[311]: Runtime Journal (/run/log/journal/b127680ea52a46b096f23ca4a3f61658) is 8.0M, max 153.5M, 145.5M free.
Oct 02 07:11:54 localhost systemd-sysusers[316]: Creating group 'users' with GID 100.
Oct 02 07:11:54 localhost systemd-sysusers[316]: Creating group 'dbus' with GID 81.
Oct 02 07:11:54 localhost systemd-sysusers[316]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct 02 07:11:54 localhost systemd[1]: Started Journal Service.
Oct 02 07:11:54 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 02 07:11:54 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 02 07:11:54 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 02 07:11:54 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 02 07:11:54 localhost systemd[1]: Finished Setup Virtual Console.
Oct 02 07:11:54 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct 02 07:11:54 localhost systemd[1]: Starting dracut cmdline hook...
Oct 02 07:11:54 localhost dracut-cmdline[331]: dracut-9 dracut-057-102.git20250818.el9
Oct 02 07:11:54 localhost dracut-cmdline[331]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 02 07:11:54 localhost systemd[1]: Finished dracut cmdline hook.
Oct 02 07:11:54 localhost systemd[1]: Starting dracut pre-udev hook...
Oct 02 07:11:54 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct 02 07:11:54 localhost kernel: device-mapper: uevent: version 1.0.3
Oct 02 07:11:54 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct 02 07:11:54 localhost kernel: RPC: Registered named UNIX socket transport module.
Oct 02 07:11:54 localhost kernel: RPC: Registered udp transport module.
Oct 02 07:11:54 localhost kernel: RPC: Registered tcp transport module.
Oct 02 07:11:54 localhost kernel: RPC: Registered tcp-with-tls transport module.
Oct 02 07:11:54 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct 02 07:11:54 localhost rpc.statd[448]: Version 2.5.4 starting
Oct 02 07:11:54 localhost rpc.statd[448]: Initializing NSM state
Oct 02 07:11:54 localhost rpc.idmapd[453]: Setting log level to 0
Oct 02 07:11:54 localhost systemd[1]: Finished dracut pre-udev hook.
Oct 02 07:11:54 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 02 07:11:54 localhost systemd-udevd[466]: Using default interface naming scheme 'rhel-9.0'.
Oct 02 07:11:54 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 02 07:11:55 localhost systemd[1]: Starting dracut pre-trigger hook...
Oct 02 07:11:55 localhost systemd[1]: Finished dracut pre-trigger hook.
Oct 02 07:11:55 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 02 07:11:55 localhost systemd[1]: Created slice Slice /system/modprobe.
Oct 02 07:11:55 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 02 07:11:55 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 02 07:11:55 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 02 07:11:55 localhost systemd[1]: Reached target Network.
Oct 02 07:11:55 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 02 07:11:55 localhost systemd[1]: Starting dracut initqueue hook...
Oct 02 07:11:55 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 02 07:11:55 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 02 07:11:55 localhost systemd[1]: Mounting Kernel Configuration File System...
Oct 02 07:11:55 localhost systemd[1]: Mounted Kernel Configuration File System.
Oct 02 07:11:55 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct 02 07:11:55 localhost systemd[1]: Reached target System Initialization.
Oct 02 07:11:55 localhost systemd[1]: Reached target Basic System.
Oct 02 07:11:55 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct 02 07:11:55 localhost kernel:  vda: vda1
Oct 02 07:11:55 localhost systemd-udevd[471]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 07:11:55 localhost kernel: libata version 3.00 loaded.
Oct 02 07:11:55 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Oct 02 07:11:55 localhost kernel: scsi host0: ata_piix
Oct 02 07:11:55 localhost kernel: scsi host1: ata_piix
Oct 02 07:11:55 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct 02 07:11:55 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct 02 07:11:55 localhost systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct 02 07:11:55 localhost systemd[1]: Reached target Initrd Root Device.
Oct 02 07:11:55 localhost kernel: ata1: found unknown device (class 0)
Oct 02 07:11:55 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct 02 07:11:55 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct 02 07:11:55 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct 02 07:11:55 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct 02 07:11:55 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct 02 07:11:55 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Oct 02 07:11:55 localhost systemd[1]: Finished dracut initqueue hook.
Oct 02 07:11:55 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Oct 02 07:11:55 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Oct 02 07:11:55 localhost systemd[1]: Reached target Remote File Systems.
Oct 02 07:11:55 localhost systemd[1]: Starting dracut pre-mount hook...
Oct 02 07:11:55 localhost systemd[1]: Finished dracut pre-mount hook.
Oct 02 07:11:55 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct 02 07:11:55 localhost systemd-fsck[561]: /usr/sbin/fsck.xfs: XFS file system.
Oct 02 07:11:55 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct 02 07:11:55 localhost systemd[1]: Mounting /sysroot...
Oct 02 07:11:56 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct 02 07:11:56 localhost kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct 02 07:11:56 localhost kernel: XFS (vda1): Ending clean mount
Oct 02 07:11:56 localhost systemd[1]: Mounted /sysroot.
Oct 02 07:11:56 localhost systemd[1]: Reached target Initrd Root File System.
Oct 02 07:11:56 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct 02 07:11:56 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct 02 07:11:56 localhost systemd[1]: Reached target Initrd File Systems.
Oct 02 07:11:56 localhost systemd[1]: Reached target Initrd Default Target.
Oct 02 07:11:56 localhost systemd[1]: Starting dracut mount hook...
Oct 02 07:11:56 localhost systemd[1]: Finished dracut mount hook.
Oct 02 07:11:56 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct 02 07:11:56 localhost rpc.idmapd[453]: exiting on signal 15
Oct 02 07:11:56 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct 02 07:11:56 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct 02 07:11:56 localhost systemd[1]: Stopped target Network.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Timer Units.
Oct 02 07:11:56 localhost systemd[1]: dbus.socket: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Oct 02 07:11:56 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Initrd Default Target.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Basic System.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Initrd Root Device.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Initrd /usr File System.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Path Units.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Remote File Systems.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Slice Units.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Socket Units.
Oct 02 07:11:56 localhost systemd[1]: Stopped target System Initialization.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Local File Systems.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Swaps.
Oct 02 07:11:56 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped dracut mount hook.
Oct 02 07:11:56 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped dracut pre-mount hook.
Oct 02 07:11:56 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Oct 02 07:11:56 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct 02 07:11:56 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped dracut initqueue hook.
Oct 02 07:11:56 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped Apply Kernel Variables.
Oct 02 07:11:56 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Oct 02 07:11:56 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped Coldplug All udev Devices.
Oct 02 07:11:56 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped dracut pre-trigger hook.
Oct 02 07:11:56 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 02 07:11:56 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped Setup Virtual Console.
Oct 02 07:11:56 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 02 07:11:56 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct 02 07:11:56 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Closed udev Control Socket.
Oct 02 07:11:56 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Closed udev Kernel Socket.
Oct 02 07:11:56 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped dracut pre-udev hook.
Oct 02 07:11:56 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped dracut cmdline hook.
Oct 02 07:11:56 localhost systemd[1]: Starting Cleanup udev Database...
Oct 02 07:11:56 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct 02 07:11:56 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Oct 02 07:11:56 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Stopped Create System Users.
Oct 02 07:11:56 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 02 07:11:56 localhost systemd[1]: Finished Cleanup udev Database.
Oct 02 07:11:56 localhost systemd[1]: Reached target Switch Root.
Oct 02 07:11:56 localhost systemd[1]: Starting Switch Root...
Oct 02 07:11:56 localhost systemd[1]: Switching root.
Oct 02 07:11:56 localhost systemd-journald[311]: Journal stopped
Oct 02 07:11:57 localhost systemd-journald[311]: Received SIGTERM from PID 1 (systemd).
Oct 02 07:11:57 localhost kernel: audit: type=1404 audit(1759389116.787:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct 02 07:11:57 localhost kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:11:57 localhost kernel: SELinux:  policy capability open_perms=1
Oct 02 07:11:57 localhost kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:11:57 localhost kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:11:57 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:11:57 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:11:57 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:11:57 localhost kernel: audit: type=1403 audit(1759389116.964:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 02 07:11:57 localhost systemd[1]: Successfully loaded SELinux policy in 180.873ms.
Oct 02 07:11:57 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 28.027ms.
Oct 02 07:11:57 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 02 07:11:57 localhost systemd[1]: Detected virtualization kvm.
Oct 02 07:11:57 localhost systemd[1]: Detected architecture x86-64.
Oct 02 07:11:57 localhost systemd-rc-local-generator[642]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:11:57 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Oct 02 07:11:57 localhost systemd[1]: Stopped Switch Root.
Oct 02 07:11:57 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 02 07:11:57 localhost systemd[1]: Created slice Slice /system/getty.
Oct 02 07:11:57 localhost systemd[1]: Created slice Slice /system/serial-getty.
Oct 02 07:11:57 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Oct 02 07:11:57 localhost systemd[1]: Created slice User and Session Slice.
Oct 02 07:11:57 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 02 07:11:57 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Oct 02 07:11:57 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct 02 07:11:57 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 02 07:11:57 localhost systemd[1]: Stopped target Switch Root.
Oct 02 07:11:57 localhost systemd[1]: Stopped target Initrd File Systems.
Oct 02 07:11:57 localhost systemd[1]: Stopped target Initrd Root File System.
Oct 02 07:11:57 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Oct 02 07:11:57 localhost systemd[1]: Reached target Path Units.
Oct 02 07:11:57 localhost systemd[1]: Reached target rpc_pipefs.target.
Oct 02 07:11:57 localhost systemd[1]: Reached target Slice Units.
Oct 02 07:11:57 localhost systemd[1]: Reached target Swaps.
Oct 02 07:11:57 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Oct 02 07:11:57 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Oct 02 07:11:57 localhost systemd[1]: Reached target RPC Port Mapper.
Oct 02 07:11:57 localhost systemd[1]: Listening on Process Core Dump Socket.
Oct 02 07:11:57 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Oct 02 07:11:57 localhost systemd[1]: Listening on udev Control Socket.
Oct 02 07:11:57 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 02 07:11:57 localhost systemd[1]: Mounting Huge Pages File System...
Oct 02 07:11:57 localhost systemd[1]: Mounting POSIX Message Queue File System...
Oct 02 07:11:57 localhost systemd[1]: Mounting Kernel Debug File System...
Oct 02 07:11:57 localhost systemd[1]: Mounting Kernel Trace File System...
Oct 02 07:11:57 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 02 07:11:57 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 02 07:11:57 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 02 07:11:57 localhost systemd[1]: Starting Load Kernel Module drm...
Oct 02 07:11:57 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Oct 02 07:11:57 localhost systemd[1]: Starting Load Kernel Module fuse...
Oct 02 07:11:57 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct 02 07:11:57 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Oct 02 07:11:57 localhost systemd[1]: Stopped File System Check on Root Device.
Oct 02 07:11:57 localhost systemd[1]: Stopped Journal Service.
Oct 02 07:11:57 localhost systemd[1]: Starting Journal Service...
Oct 02 07:11:57 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 02 07:11:57 localhost systemd[1]: Starting Generate network units from Kernel command line...
Oct 02 07:11:57 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 02 07:11:57 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Oct 02 07:11:57 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct 02 07:11:57 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 02 07:11:57 localhost kernel: fuse: init (API version 7.37)
Oct 02 07:11:57 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 02 07:11:57 localhost systemd[1]: Mounted Huge Pages File System.
Oct 02 07:11:57 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct 02 07:11:57 localhost systemd-journald[683]: Journal started
Oct 02 07:11:57 localhost systemd-journald[683]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct 02 07:11:57 localhost systemd[1]: Queued start job for default target Multi-User System.
Oct 02 07:11:57 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 02 07:11:57 localhost systemd[1]: Started Journal Service.
Oct 02 07:11:57 localhost systemd[1]: Mounted POSIX Message Queue File System.
Oct 02 07:11:57 localhost systemd[1]: Mounted Kernel Debug File System.
Oct 02 07:11:57 localhost systemd[1]: Mounted Kernel Trace File System.
Oct 02 07:11:57 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 02 07:11:57 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 02 07:11:57 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 02 07:11:57 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct 02 07:11:57 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Oct 02 07:11:57 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct 02 07:11:57 localhost systemd[1]: Finished Load Kernel Module fuse.
Oct 02 07:11:57 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct 02 07:11:57 localhost systemd[1]: Finished Generate network units from Kernel command line.
Oct 02 07:11:57 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Oct 02 07:11:57 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 02 07:11:57 localhost kernel: ACPI: bus type drm_connector registered
Oct 02 07:11:57 localhost systemd[1]: Mounting FUSE Control File System...
Oct 02 07:11:57 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 02 07:11:57 localhost systemd[1]: Starting Rebuild Hardware Database...
Oct 02 07:11:57 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Oct 02 07:11:57 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct 02 07:11:57 localhost systemd[1]: Starting Load/Save OS Random Seed...
Oct 02 07:11:57 localhost systemd[1]: Starting Create System Users...
Oct 02 07:11:57 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 02 07:11:57 localhost systemd[1]: Finished Load Kernel Module drm.
Oct 02 07:11:57 localhost systemd[1]: Mounted FUSE Control File System.
Oct 02 07:11:57 localhost systemd-journald[683]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct 02 07:11:57 localhost systemd-journald[683]: Received client request to flush runtime journal.
Oct 02 07:11:57 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Oct 02 07:11:57 localhost systemd[1]: Finished Load/Save OS Random Seed.
Oct 02 07:11:57 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 02 07:11:57 localhost systemd[1]: Finished Create System Users.
Oct 02 07:11:57 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 02 07:11:57 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 02 07:11:57 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 02 07:11:57 localhost systemd[1]: Reached target Preparation for Local File Systems.
Oct 02 07:11:57 localhost systemd[1]: Reached target Local File Systems.
Oct 02 07:11:57 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct 02 07:11:58 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct 02 07:11:58 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 02 07:11:58 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct 02 07:11:58 localhost systemd[1]: Starting Automatic Boot Loader Update...
Oct 02 07:11:58 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct 02 07:11:58 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 02 07:11:58 localhost bootctl[703]: Couldn't find EFI system partition, skipping.
Oct 02 07:11:58 localhost systemd[1]: Finished Automatic Boot Loader Update.
Oct 02 07:11:58 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 02 07:11:58 localhost systemd[1]: Starting Security Auditing Service...
Oct 02 07:11:58 localhost systemd[1]: Starting RPC Bind...
Oct 02 07:11:58 localhost systemd[1]: Starting Rebuild Journal Catalog...
Oct 02 07:11:58 localhost auditd[709]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct 02 07:11:58 localhost auditd[709]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct 02 07:11:58 localhost systemd[1]: Started RPC Bind.
Oct 02 07:11:58 localhost augenrules[714]: /sbin/augenrules: No change
Oct 02 07:11:58 localhost systemd[1]: Finished Rebuild Journal Catalog.
Oct 02 07:11:58 localhost augenrules[729]: No rules
Oct 02 07:11:58 localhost augenrules[729]: enabled 1
Oct 02 07:11:58 localhost augenrules[729]: failure 1
Oct 02 07:11:58 localhost augenrules[729]: pid 709
Oct 02 07:11:58 localhost augenrules[729]: rate_limit 0
Oct 02 07:11:58 localhost augenrules[729]: backlog_limit 8192
Oct 02 07:11:58 localhost augenrules[729]: lost 0
Oct 02 07:11:58 localhost augenrules[729]: backlog 1
Oct 02 07:11:58 localhost augenrules[729]: backlog_wait_time 60000
Oct 02 07:11:58 localhost augenrules[729]: backlog_wait_time_actual 0
Oct 02 07:11:58 localhost augenrules[729]: enabled 1
Oct 02 07:11:58 localhost augenrules[729]: failure 1
Oct 02 07:11:58 localhost augenrules[729]: pid 709
Oct 02 07:11:58 localhost augenrules[729]: rate_limit 0
Oct 02 07:11:58 localhost augenrules[729]: backlog_limit 8192
Oct 02 07:11:58 localhost augenrules[729]: lost 0
Oct 02 07:11:58 localhost augenrules[729]: backlog 0
Oct 02 07:11:58 localhost augenrules[729]: backlog_wait_time 60000
Oct 02 07:11:58 localhost augenrules[729]: backlog_wait_time_actual 0
Oct 02 07:11:58 localhost augenrules[729]: enabled 1
Oct 02 07:11:58 localhost augenrules[729]: failure 1
Oct 02 07:11:58 localhost augenrules[729]: pid 709
Oct 02 07:11:58 localhost augenrules[729]: rate_limit 0
Oct 02 07:11:58 localhost augenrules[729]: backlog_limit 8192
Oct 02 07:11:58 localhost augenrules[729]: lost 0
Oct 02 07:11:58 localhost augenrules[729]: backlog 0
Oct 02 07:11:58 localhost augenrules[729]: backlog_wait_time 60000
Oct 02 07:11:58 localhost augenrules[729]: backlog_wait_time_actual 0
Oct 02 07:11:58 localhost systemd[1]: Started Security Auditing Service.
Oct 02 07:11:58 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct 02 07:11:58 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct 02 07:11:58 localhost systemd[1]: Finished Rebuild Hardware Database.
Oct 02 07:11:58 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 02 07:11:58 localhost systemd-udevd[737]: Using default interface naming scheme 'rhel-9.0'.
Oct 02 07:11:58 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct 02 07:11:58 localhost systemd[1]: Starting Update is Completed...
Oct 02 07:11:58 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 02 07:11:58 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 02 07:11:58 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct 02 07:11:58 localhost systemd[1]: Finished Update is Completed.
Oct 02 07:11:58 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 02 07:11:58 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 02 07:11:58 localhost systemd-udevd[739]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 07:11:58 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct 02 07:11:58 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct 02 07:11:58 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct 02 07:11:58 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct 02 07:11:58 localhost systemd[1]: Reached target System Initialization.
Oct 02 07:11:58 localhost systemd[1]: Started dnf makecache --timer.
Oct 02 07:11:58 localhost systemd[1]: Started Daily rotation of log files.
Oct 02 07:11:58 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct 02 07:11:58 localhost systemd[1]: Reached target Timer Units.
Oct 02 07:11:58 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 02 07:11:58 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct 02 07:11:58 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct 02 07:11:58 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct 02 07:11:58 localhost kernel: kvm_amd: TSC scaling supported
Oct 02 07:11:58 localhost kernel: kvm_amd: Nested Virtualization enabled
Oct 02 07:11:58 localhost kernel: kvm_amd: Nested Paging enabled
Oct 02 07:11:58 localhost kernel: kvm_amd: LBR virtualization supported
Oct 02 07:11:58 localhost systemd[1]: Reached target Socket Units.
Oct 02 07:11:58 localhost kernel: Console: switching to colour dummy device 80x25
Oct 02 07:11:58 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct 02 07:11:58 localhost kernel: [drm] features: -context_init
Oct 02 07:11:58 localhost kernel: [drm] number of scanouts: 1
Oct 02 07:11:58 localhost kernel: [drm] number of cap sets: 0
Oct 02 07:11:58 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct 02 07:11:58 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct 02 07:11:58 localhost kernel: Console: switching to colour frame buffer device 128x48
Oct 02 07:11:58 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct 02 07:11:58 localhost systemd[1]: Starting D-Bus System Message Bus...
Oct 02 07:11:58 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 02 07:11:58 localhost systemd[1]: Started D-Bus System Message Bus.
Oct 02 07:11:58 localhost dbus-broker-lau[793]: Ready
Oct 02 07:11:58 localhost systemd[1]: Reached target Basic System.
Oct 02 07:11:58 localhost systemd[1]: Starting NTP client/server...
Oct 02 07:11:58 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct 02 07:11:58 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct 02 07:11:58 localhost systemd[1]: Starting IPv4 firewall with iptables...
Oct 02 07:11:58 localhost systemd[1]: Started irqbalance daemon.
Oct 02 07:11:58 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct 02 07:11:58 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 02 07:11:58 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 02 07:11:58 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 02 07:11:58 localhost systemd[1]: Reached target sshd-keygen.target.
Oct 02 07:11:58 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct 02 07:11:58 localhost systemd[1]: Reached target User and Group Name Lookups.
Oct 02 07:11:58 localhost systemd[1]: Starting User Login Management...
Oct 02 07:11:58 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct 02 07:11:58 localhost chronyd[835]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 02 07:11:59 localhost chronyd[835]: Loaded 0 symmetric keys
Oct 02 07:11:59 localhost chronyd[835]: Using right/UTC timezone to obtain leap second data
Oct 02 07:11:59 localhost chronyd[835]: Loaded seccomp filter (level 2)
Oct 02 07:11:59 localhost systemd[1]: Started NTP client/server.
Oct 02 07:11:59 localhost systemd-logind[827]: New seat seat0.
Oct 02 07:11:59 localhost systemd-logind[827]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 02 07:11:59 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct 02 07:11:59 localhost systemd-logind[827]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 02 07:11:59 localhost systemd[1]: Started User Login Management.
Oct 02 07:11:59 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct 02 07:11:59 localhost iptables.init[822]: iptables: Applying firewall rules: [  OK  ]
Oct 02 07:11:59 localhost systemd[1]: Finished IPv4 firewall with iptables.
Oct 02 07:11:59 localhost cloud-init[845]: Cloud-init v. 24.4-7.el9 running 'init-local' at Thu, 02 Oct 2025 07:11:59 +0000. Up 7.31 seconds.
Oct 02 07:11:59 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Oct 02 07:11:59 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Oct 02 07:11:59 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpggfkoao3.mount: Deactivated successfully.
Oct 02 07:12:00 localhost systemd[1]: Starting Hostname Service...
Oct 02 07:12:00 localhost systemd[1]: Started Hostname Service.
Oct 02 07:12:00 np0005465596.novalocal systemd-hostnamed[859]: Hostname set to <np0005465596.novalocal> (static)
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Reached target Preparation for Network.
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Starting Network Manager...
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.2680] NetworkManager (version 1.54.1-1.el9) is starting... (boot:f022b9c3-1b0c-4cc3-9fcc-f643153c9b0a)
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.2685] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.2842] manager[0x55a35a509080]: monitoring kernel firmware directory '/lib/firmware'.
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.2909] hostname: hostname: using hostnamed
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.2909] hostname: static hostname changed from (none) to "np0005465596.novalocal"
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.2915] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3103] manager[0x55a35a509080]: rfkill: Wi-Fi hardware radio set enabled
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3104] manager[0x55a35a509080]: rfkill: WWAN hardware radio set enabled
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3185] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3185] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3186] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3186] manager: Networking is enabled by state file
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3188] settings: Loaded settings plugin: keyfile (internal)
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3218] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3239] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3274] dhcp: init: Using DHCP client 'internal'
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3277] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3288] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3300] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3307] device (lo): Activation: starting connection 'lo' (77cf1f15-4a84-4c7c-ae0f-bf80f6a18c78)
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3315] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3319] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3342] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3346] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3348] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3351] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3353] device (eth0): carrier: link connected
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3356] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3362] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3367] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3372] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3373] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3375] manager: NetworkManager state is now CONNECTING
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3376] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3381] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3384] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Started Network Manager.
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Reached target Network.
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3693] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3695] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 02 07:12:00 np0005465596.novalocal NetworkManager[863]: <info>  [1759389120.3700] device (lo): Activation: successful, device activated.
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Reached target NFS client services.
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: Reached target Remote File Systems.
Oct 02 07:12:00 np0005465596.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 02 07:12:01 np0005465596.novalocal NetworkManager[863]: <info>  [1759389121.4162] dhcp4 (eth0): state changed new lease, address=38.102.83.73
Oct 02 07:12:01 np0005465596.novalocal NetworkManager[863]: <info>  [1759389121.4178] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 02 07:12:01 np0005465596.novalocal NetworkManager[863]: <info>  [1759389121.4213] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:12:01 np0005465596.novalocal NetworkManager[863]: <info>  [1759389121.4259] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:12:01 np0005465596.novalocal NetworkManager[863]: <info>  [1759389121.4262] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:12:01 np0005465596.novalocal NetworkManager[863]: <info>  [1759389121.4265] manager: NetworkManager state is now CONNECTED_SITE
Oct 02 07:12:01 np0005465596.novalocal NetworkManager[863]: <info>  [1759389121.4269] device (eth0): Activation: successful, device activated.
Oct 02 07:12:01 np0005465596.novalocal NetworkManager[863]: <info>  [1759389121.4275] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 02 07:12:01 np0005465596.novalocal NetworkManager[863]: <info>  [1759389121.4280] manager: startup complete
Oct 02 07:12:01 np0005465596.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 02 07:12:01 np0005465596.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: Cloud-init v. 24.4-7.el9 running 'init' at Thu, 02 Oct 2025 07:12:01 +0000. Up 9.37 seconds.
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: |  eth0  | True |         38.102.83.73         | 255.255.255.0 | global | fa:16:3e:9d:76:57 |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: |  eth0  | True | fe80::f816:3eff:fe9d:7657/64 |       .       |  link  | fa:16:3e:9d:76:57 |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Oct 02 07:12:01 np0005465596.novalocal cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 02 07:12:02 np0005465596.novalocal useradd[994]: new group: name=cloud-user, GID=1001
Oct 02 07:12:02 np0005465596.novalocal useradd[994]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Oct 02 07:12:02 np0005465596.novalocal useradd[994]: add 'cloud-user' to group 'adm'
Oct 02 07:12:02 np0005465596.novalocal useradd[994]: add 'cloud-user' to group 'systemd-journal'
Oct 02 07:12:02 np0005465596.novalocal useradd[994]: add 'cloud-user' to shadow group 'adm'
Oct 02 07:12:02 np0005465596.novalocal useradd[994]: add 'cloud-user' to shadow group 'systemd-journal'
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: Generating public/private rsa key pair.
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: The key fingerprint is:
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: SHA256:V+BUwwONX32CV8YmExxR0xpVlXagPFRplgWKvoo75Us root@np0005465596.novalocal
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: The key's randomart image is:
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: +---[RSA 3072]----+
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |          +*++B^&|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |         o.==+&+O|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |          o.**.Bo|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |         . ....  |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |        S o      |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |        .. .     |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |       oE .      |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |      .o..       |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |      ooo.       |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: +----[SHA256]-----+
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: Generating public/private ecdsa key pair.
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: The key fingerprint is:
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: SHA256:aXd/xtrhPXNrk1uRhy6jBGYV9I8IWDxUA8XfDJlZVKU root@np0005465596.novalocal
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: The key's randomart image is:
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: +---[ECDSA 256]---+
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |       o+**  *o.+|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |       oo .+=  . |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |      . .....+E  |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |         + ..oo..|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |        S o o oo.|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |       + o . o .o|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |          . o o.*|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |         . . o.@=|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |          .   o+X|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: +----[SHA256]-----+
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: Generating public/private ed25519 key pair.
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: The key fingerprint is:
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: SHA256:BFecLdyCJnlWsogf0t6aR0IXc5tFXVHgBf/hBbde1Ag root@np0005465596.novalocal
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: The key's randomart image is:
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: +--[ED25519 256]--+
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |      ..=*+=Eo+*X|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |     oo+=**+o.o*o|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |    o ==+ oo  .o+|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |     = =      o =|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |      + S      o.|
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |       =         |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |      o .        |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |       .         |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: |                 |
Oct 02 07:12:03 np0005465596.novalocal cloud-init[927]: +----[SHA256]-----+
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Reached target Cloud-config availability.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Reached target Network is Online.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Starting System Logging Service...
Oct 02 07:12:03 np0005465596.novalocal sm-notify[1009]: Version 2.5.4 starting
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Starting OpenSSH server daemon...
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Starting Permit User Sessions...
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Started Notify NFS peers of a restart.
Oct 02 07:12:03 np0005465596.novalocal sshd[1011]: Server listening on 0.0.0.0 port 22.
Oct 02 07:12:03 np0005465596.novalocal sshd[1011]: Server listening on :: port 22.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Started OpenSSH server daemon.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Finished Permit User Sessions.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Started Command Scheduler.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Started Getty on tty1.
Oct 02 07:12:03 np0005465596.novalocal rsyslogd[1010]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1010" x-info="https://www.rsyslog.com"] start
Oct 02 07:12:03 np0005465596.novalocal rsyslogd[1010]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct 02 07:12:03 np0005465596.novalocal crond[1013]: (CRON) STARTUP (1.5.7)
Oct 02 07:12:03 np0005465596.novalocal crond[1013]: (CRON) INFO (Syslog will be used instead of sendmail.)
Oct 02 07:12:03 np0005465596.novalocal crond[1013]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 48% if used.)
Oct 02 07:12:03 np0005465596.novalocal crond[1013]: (CRON) INFO (running with inotify support)
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Started Serial Getty on ttyS0.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Reached target Login Prompts.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Started System Logging Service.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Reached target Multi-User System.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Oct 02 07:12:03 np0005465596.novalocal rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 07:12:03 np0005465596.novalocal cloud-init[1022]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Thu, 02 Oct 2025 07:12:03 +0000. Up 11.28 seconds.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Oct 02 07:12:03 np0005465596.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Oct 02 07:12:04 np0005465596.novalocal cloud-init[1026]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Thu, 02 Oct 2025 07:12:04 +0000. Up 11.69 seconds.
Oct 02 07:12:04 np0005465596.novalocal cloud-init[1028]: #############################################################
Oct 02 07:12:04 np0005465596.novalocal cloud-init[1029]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct 02 07:12:04 np0005465596.novalocal cloud-init[1031]: 256 SHA256:aXd/xtrhPXNrk1uRhy6jBGYV9I8IWDxUA8XfDJlZVKU root@np0005465596.novalocal (ECDSA)
Oct 02 07:12:04 np0005465596.novalocal cloud-init[1033]: 256 SHA256:BFecLdyCJnlWsogf0t6aR0IXc5tFXVHgBf/hBbde1Ag root@np0005465596.novalocal (ED25519)
Oct 02 07:12:04 np0005465596.novalocal cloud-init[1035]: 3072 SHA256:V+BUwwONX32CV8YmExxR0xpVlXagPFRplgWKvoo75Us root@np0005465596.novalocal (RSA)
Oct 02 07:12:04 np0005465596.novalocal cloud-init[1036]: -----END SSH HOST KEY FINGERPRINTS-----
Oct 02 07:12:04 np0005465596.novalocal cloud-init[1037]: #############################################################
Oct 02 07:12:04 np0005465596.novalocal cloud-init[1026]: Cloud-init v. 24.4-7.el9 finished at Thu, 02 Oct 2025 07:12:04 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.89 seconds
Oct 02 07:12:04 np0005465596.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Oct 02 07:12:04 np0005465596.novalocal systemd[1]: Reached target Cloud-init target.
Oct 02 07:12:04 np0005465596.novalocal systemd[1]: Startup finished in 1.659s (kernel) + 2.762s (initrd) + 7.542s (userspace) = 11.964s.
Oct 02 07:12:04 np0005465596.novalocal sshd-session[1043]: Unable to negotiate with 38.102.83.114 port 44460: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Oct 02 07:12:04 np0005465596.novalocal sshd-session[1045]: Connection closed by 38.102.83.114 port 44472 [preauth]
Oct 02 07:12:04 np0005465596.novalocal sshd-session[1047]: Unable to negotiate with 38.102.83.114 port 44484: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Oct 02 07:12:04 np0005465596.novalocal sshd-session[1049]: Unable to negotiate with 38.102.83.114 port 44490: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Oct 02 07:12:04 np0005465596.novalocal sshd-session[1041]: Connection closed by 38.102.83.114 port 44448 [preauth]
Oct 02 07:12:04 np0005465596.novalocal sshd-session[1055]: Unable to negotiate with 38.102.83.114 port 44512: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Oct 02 07:12:04 np0005465596.novalocal sshd-session[1057]: Unable to negotiate with 38.102.83.114 port 44518: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Oct 02 07:12:04 np0005465596.novalocal sshd-session[1051]: Connection closed by 38.102.83.114 port 44502 [preauth]
Oct 02 07:12:04 np0005465596.novalocal sshd-session[1053]: Connection closed by 38.102.83.114 port 44508 [preauth]
Oct 02 07:12:06 np0005465596.novalocal chronyd[835]: Selected source 216.128.178.20 (2.centos.pool.ntp.org)
Oct 02 07:12:06 np0005465596.novalocal chronyd[835]: System clock TAI offset set to 37 seconds
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: Cannot change IRQ 25 affinity: Operation not permitted
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: IRQ 25 affinity is now unmanaged
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: Cannot change IRQ 31 affinity: Operation not permitted
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: IRQ 31 affinity is now unmanaged
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: Cannot change IRQ 28 affinity: Operation not permitted
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: IRQ 28 affinity is now unmanaged
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: Cannot change IRQ 32 affinity: Operation not permitted
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: IRQ 32 affinity is now unmanaged
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: Cannot change IRQ 30 affinity: Operation not permitted
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: IRQ 30 affinity is now unmanaged
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: Cannot change IRQ 29 affinity: Operation not permitted
Oct 02 07:12:09 np0005465596.novalocal irqbalance[823]: IRQ 29 affinity is now unmanaged
Oct 02 07:12:11 np0005465596.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 02 07:12:25 np0005465596.novalocal sshd-session[1059]: Accepted publickey for zuul from 38.102.83.114 port 54908 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Oct 02 07:12:25 np0005465596.novalocal systemd[1]: Created slice User Slice of UID 1000.
Oct 02 07:12:25 np0005465596.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 02 07:12:25 np0005465596.novalocal systemd-logind[827]: New session 1 of user zuul.
Oct 02 07:12:25 np0005465596.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 02 07:12:25 np0005465596.novalocal systemd[1]: Starting User Manager for UID 1000...
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Queued start job for default target Main User Target.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Created slice User Application Slice.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Started Daily Cleanup of User's Temporary Directories.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Reached target Paths.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Reached target Timers.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Starting D-Bus User Message Bus Socket...
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Starting Create User's Volatile Files and Directories...
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Listening on D-Bus User Message Bus Socket.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Reached target Sockets.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Finished Create User's Volatile Files and Directories.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Reached target Basic System.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Reached target Main User Target.
Oct 02 07:12:25 np0005465596.novalocal systemd[1063]: Startup finished in 145ms.
Oct 02 07:12:25 np0005465596.novalocal systemd[1]: Started User Manager for UID 1000.
Oct 02 07:12:25 np0005465596.novalocal systemd[1]: Started Session 1 of User zuul.
Oct 02 07:12:25 np0005465596.novalocal sshd-session[1059]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:12:26 np0005465596.novalocal python3[1145]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:12:28 np0005465596.novalocal python3[1173]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:12:30 np0005465596.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 02 07:12:35 np0005465596.novalocal python3[1233]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:12:36 np0005465596.novalocal python3[1273]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct 02 07:12:38 np0005465596.novalocal python3[1299]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyaKkK41Mbak9gTlgiig0nqof017BwMzK/pgooJIf2EvgWa+uilSK+KtvJlSmYcKFmT7gcd/oJ0wZiiEtbNzKGwOVp9T59lCdL18ywYvfac8l+coAeOubTJVwEGbbX2dKs/Fb0DBGAjXnPnXal/36x9vFDL/iBr09MDsVcErcx8aaU5feE65HCV19GGbSAIW9DT508cJsUB9ybcpkvisV5mzD3WI60bnEpqCMu8lsUvh/I6RdhG0Ml6hLaEt+L1s5Q4L1AXbOzQA+ahm0joqozYpb2X9S1lHP38TtfCP9id2dvPAgaqUTRd8132BNMd/hD3XQ9OkyHA8qvqbu3A8nN7ipRYNnugdp3do5JbwtlbK6yNfrX5Uv9vqZE7cYdABp7P8+y05I0++z4r92FesjCi2of0QP3flrN7f1JKvJgHbtDZbV/HzxWiiZyDk0d1Fc/7aex8vp40wJElXIUZksPJu4279nIMTTzc5YZYhX+wu1oZJ+mS/jR3iiSMtwcjNE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:39 np0005465596.novalocal python3[1323]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:39 np0005465596.novalocal python3[1422]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:12:40 np0005465596.novalocal python3[1493]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759389159.4066913-229-273414197642137/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=9c98b360186741189daa1bd2463033df_id_rsa follow=False checksum=cb968992e53420aaeeeac2a411e82f19e866f7b2 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:40 np0005465596.novalocal python3[1616]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:12:41 np0005465596.novalocal python3[1687]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759389160.4659731-273-66354322535215/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=9c98b360186741189daa1bd2463033df_id_rsa.pub follow=False checksum=bde31fd9c24f5c54ae6f9f24eca731b3eb84eb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:42 np0005465596.novalocal python3[1735]: ansible-ping Invoked with data=pong
Oct 02 07:12:43 np0005465596.novalocal python3[1759]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:12:45 np0005465596.novalocal python3[1817]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct 02 07:12:46 np0005465596.novalocal python3[1849]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:46 np0005465596.novalocal python3[1873]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:46 np0005465596.novalocal python3[1897]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:47 np0005465596.novalocal python3[1921]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:47 np0005465596.novalocal python3[1945]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:47 np0005465596.novalocal python3[1969]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:49 np0005465596.novalocal sudo[1993]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anpvwmvcdttmpxyzshuheiobckxdxduo ; /usr/bin/python3'
Oct 02 07:12:49 np0005465596.novalocal sudo[1993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:12:49 np0005465596.novalocal python3[1995]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:49 np0005465596.novalocal sudo[1993]: pam_unix(sudo:session): session closed for user root
Oct 02 07:12:49 np0005465596.novalocal sudo[2071]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbygexiynggtvafxbzfyoaaegfmapvrf ; /usr/bin/python3'
Oct 02 07:12:49 np0005465596.novalocal sudo[2071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:12:50 np0005465596.novalocal python3[2073]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:12:50 np0005465596.novalocal sudo[2071]: pam_unix(sudo:session): session closed for user root
Oct 02 07:12:50 np0005465596.novalocal sudo[2144]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujavcqxfjmonvxxhqmsfhkgcnptopjqm ; /usr/bin/python3'
Oct 02 07:12:50 np0005465596.novalocal sudo[2144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:12:50 np0005465596.novalocal python3[2146]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759389169.524413-26-62406795004265/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:12:50 np0005465596.novalocal sudo[2144]: pam_unix(sudo:session): session closed for user root
Oct 02 07:12:51 np0005465596.novalocal python3[2194]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:51 np0005465596.novalocal python3[2218]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:51 np0005465596.novalocal python3[2242]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:52 np0005465596.novalocal python3[2266]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:52 np0005465596.novalocal python3[2290]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:52 np0005465596.novalocal python3[2314]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:52 np0005465596.novalocal python3[2338]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:53 np0005465596.novalocal python3[2362]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:53 np0005465596.novalocal python3[2386]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:53 np0005465596.novalocal python3[2410]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:54 np0005465596.novalocal python3[2434]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:54 np0005465596.novalocal python3[2458]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:54 np0005465596.novalocal python3[2482]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:54 np0005465596.novalocal python3[2506]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:55 np0005465596.novalocal python3[2530]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:55 np0005465596.novalocal python3[2554]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:55 np0005465596.novalocal python3[2578]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:56 np0005465596.novalocal python3[2602]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:56 np0005465596.novalocal python3[2626]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:56 np0005465596.novalocal python3[2650]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:57 np0005465596.novalocal python3[2674]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:57 np0005465596.novalocal python3[2698]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:57 np0005465596.novalocal python3[2722]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:57 np0005465596.novalocal python3[2746]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:58 np0005465596.novalocal python3[2770]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:58 np0005465596.novalocal python3[2794]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:12:59 np0005465596.novalocal irqbalance[823]: Cannot change IRQ 26 affinity: Operation not permitted
Oct 02 07:12:59 np0005465596.novalocal irqbalance[823]: IRQ 26 affinity is now unmanaged
Oct 02 07:13:00 np0005465596.novalocal sudo[2818]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpsgdnddgufoupfsycqtqszolyxpuyqh ; /usr/bin/python3'
Oct 02 07:13:00 np0005465596.novalocal sudo[2818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:13:00 np0005465596.novalocal python3[2820]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 02 07:13:00 np0005465596.novalocal systemd[1]: Starting Time & Date Service...
Oct 02 07:13:00 np0005465596.novalocal systemd[1]: Started Time & Date Service.
Oct 02 07:13:00 np0005465596.novalocal systemd-timedated[2822]: Changed time zone to 'UTC' (UTC).
Oct 02 07:13:00 np0005465596.novalocal sudo[2818]: pam_unix(sudo:session): session closed for user root
Oct 02 07:13:02 np0005465596.novalocal sudo[2849]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cklnrdnshdxsccrmoeqteugeesqeuhkq ; /usr/bin/python3'
Oct 02 07:13:02 np0005465596.novalocal sudo[2849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:13:02 np0005465596.novalocal python3[2851]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:13:02 np0005465596.novalocal sudo[2849]: pam_unix(sudo:session): session closed for user root
Oct 02 07:13:02 np0005465596.novalocal python3[2927]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:13:03 np0005465596.novalocal python3[2998]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759389182.4888391-202-5263586590958/source _original_basename=tmpjkknb6ec follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:13:03 np0005465596.novalocal python3[3098]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:13:04 np0005465596.novalocal python3[3169]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759389183.4572344-242-11247529508993/source _original_basename=tmp3ewth_kh follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:13:04 np0005465596.novalocal sudo[3269]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-songqiayhigznjrlqtaacjtzlgxmqdje ; /usr/bin/python3'
Oct 02 07:13:04 np0005465596.novalocal sudo[3269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:13:04 np0005465596.novalocal python3[3271]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:13:05 np0005465596.novalocal sudo[3269]: pam_unix(sudo:session): session closed for user root
Oct 02 07:13:05 np0005465596.novalocal sudo[3342]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmynctqqdwxdycxiesycpubizzyftaol ; /usr/bin/python3'
Oct 02 07:13:05 np0005465596.novalocal sudo[3342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:13:05 np0005465596.novalocal python3[3344]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759389184.65597-306-180923503925226/source _original_basename=tmp57vmoik0 follow=False checksum=f33247efe46b7b4ce18ab2651b4190ff23abc108 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:13:05 np0005465596.novalocal sudo[3342]: pam_unix(sudo:session): session closed for user root
Oct 02 07:13:05 np0005465596.novalocal python3[3392]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:13:06 np0005465596.novalocal python3[3418]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:13:06 np0005465596.novalocal sudo[3496]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdehtjasrvuaeuiznftxxbewqlivhaho ; /usr/bin/python3'
Oct 02 07:13:06 np0005465596.novalocal sudo[3496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:13:06 np0005465596.novalocal python3[3498]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:13:06 np0005465596.novalocal sudo[3496]: pam_unix(sudo:session): session closed for user root
Oct 02 07:13:07 np0005465596.novalocal sudo[3569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsswjjgcdsfohpikuvgmwkgryxssensz ; /usr/bin/python3'
Oct 02 07:13:07 np0005465596.novalocal sudo[3569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:13:07 np0005465596.novalocal python3[3571]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759389186.4049556-362-80593397925871/source _original_basename=tmp9ib3dh0f follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:13:07 np0005465596.novalocal sudo[3569]: pam_unix(sudo:session): session closed for user root
Oct 02 07:13:07 np0005465596.novalocal sudo[3620]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvdqyymazelmxlvdsldeqlkswlhpzgde ; /usr/bin/python3'
Oct 02 07:13:07 np0005465596.novalocal sudo[3620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:13:07 np0005465596.novalocal python3[3622]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-6355-8009-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:13:07 np0005465596.novalocal sudo[3620]: pam_unix(sudo:session): session closed for user root
Oct 02 07:13:08 np0005465596.novalocal python3[3650]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-6355-8009-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct 02 07:13:09 np0005465596.novalocal python3[3678]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:13:12 np0005465596.novalocal chronyd[835]: Selected source 167.160.187.179 (2.centos.pool.ntp.org)
Oct 02 07:13:26 np0005465596.novalocal sudo[3702]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeehpzsmfdjfjxzbucxtvpsiisbzdqth ; /usr/bin/python3'
Oct 02 07:13:26 np0005465596.novalocal sudo[3702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:13:26 np0005465596.novalocal python3[3704]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:13:26 np0005465596.novalocal sudo[3702]: pam_unix(sudo:session): session closed for user root
Oct 02 07:13:30 np0005465596.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 02 07:14:07 np0005465596.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 02 07:14:07 np0005465596.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct 02 07:14:07 np0005465596.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct 02 07:14:07 np0005465596.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct 02 07:14:07 np0005465596.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct 02 07:14:07 np0005465596.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct 02 07:14:07 np0005465596.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct 02 07:14:07 np0005465596.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct 02 07:14:07 np0005465596.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct 02 07:14:07 np0005465596.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3412] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 02 07:14:07 np0005465596.novalocal systemd-udevd[3708]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3633] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3684] settings: (eth1): created default wired connection 'Wired connection 1'
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3691] device (eth1): carrier: link connected
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3694] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3704] policy: auto-activating connection 'Wired connection 1' (40b6ce4d-1768-3626-b020-83475d2a4193)
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3710] device (eth1): Activation: starting connection 'Wired connection 1' (40b6ce4d-1768-3626-b020-83475d2a4193)
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3712] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3717] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3724] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:14:07 np0005465596.novalocal NetworkManager[863]: <info>  [1759389247.3732] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 02 07:14:08 np0005465596.novalocal python3[3734]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-5c25-7743-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:14:15 np0005465596.novalocal sudo[3812]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fybixzjlnsvyzrkzglzugezjhmiapsme ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 02 07:14:15 np0005465596.novalocal sudo[3812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:14:15 np0005465596.novalocal python3[3814]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:14:15 np0005465596.novalocal sudo[3812]: pam_unix(sudo:session): session closed for user root
Oct 02 07:14:15 np0005465596.novalocal sudo[3885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iexohdzpzhjdcjtgkatouhhzcpenxcxb ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 02 07:14:15 np0005465596.novalocal sudo[3885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:14:15 np0005465596.novalocal python3[3887]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759389255.1121225-103-173693899389762/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=2419f6378cc59a09f8bf771036236ebbf3a30ffa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:14:15 np0005465596.novalocal sudo[3885]: pam_unix(sudo:session): session closed for user root
Oct 02 07:14:16 np0005465596.novalocal sudo[3935]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvnypqegoeatwlywzmwetnzrdmlfwdky ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 02 07:14:16 np0005465596.novalocal sudo[3935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:14:16 np0005465596.novalocal python3[3937]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: Stopped Network Manager Wait Online.
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: Stopping Network Manager Wait Online...
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[863]: <info>  [1759389256.7657] caught SIGTERM, shutting down normally.
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: Stopping Network Manager...
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[863]: <info>  [1759389256.7669] dhcp4 (eth0): canceled DHCP transaction
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[863]: <info>  [1759389256.7670] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[863]: <info>  [1759389256.7670] dhcp4 (eth0): state changed no lease
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[863]: <info>  [1759389256.7674] manager: NetworkManager state is now CONNECTING
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[863]: <info>  [1759389256.7802] dhcp4 (eth1): canceled DHCP transaction
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[863]: <info>  [1759389256.7802] dhcp4 (eth1): state changed no lease
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[863]: <info>  [1759389256.7857] exiting (success)
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: Stopped Network Manager.
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: Starting Network Manager...
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.8591] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:f022b9c3-1b0c-4cc3-9fcc-f643153c9b0a)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.8596] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.8664] manager[0x5648f8cf4070]: monitoring kernel firmware directory '/lib/firmware'.
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: Starting Hostname Service...
Oct 02 07:14:16 np0005465596.novalocal systemd[1]: Started Hostname Service.
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9748] hostname: hostname: using hostnamed
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9750] hostname: static hostname changed from (none) to "np0005465596.novalocal"
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9759] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9769] manager[0x5648f8cf4070]: rfkill: Wi-Fi hardware radio set enabled
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9773] manager[0x5648f8cf4070]: rfkill: WWAN hardware radio set enabled
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9813] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9814] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9816] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9817] manager: Networking is enabled by state file
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9821] settings: Loaded settings plugin: keyfile (internal)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9827] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9860] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9876] dhcp: init: Using DHCP client 'internal'
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9880] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9888] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9900] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9913] device (lo): Activation: starting connection 'lo' (77cf1f15-4a84-4c7c-ae0f-bf80f6a18c78)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9925] device (eth0): carrier: link connected
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9933] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9942] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9944] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9955] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9968] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9980] device (eth1): carrier: link connected
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9988] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9997] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (40b6ce4d-1768-3626-b020-83475d2a4193) (indicated)
Oct 02 07:14:16 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389256.9998] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0009] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0020] device (eth1): Activation: starting connection 'Wired connection 1' (40b6ce4d-1768-3626-b020-83475d2a4193)
Oct 02 07:14:17 np0005465596.novalocal systemd[1]: Started Network Manager.
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0033] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0042] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0046] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0048] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0051] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0054] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0057] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0059] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0063] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0073] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0077] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0088] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0092] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0110] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0117] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0125] device (lo): Activation: successful, device activated.
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0134] dhcp4 (eth0): state changed new lease, address=38.102.83.73
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0143] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 02 07:14:17 np0005465596.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0223] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0269] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0273] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0278] manager: NetworkManager state is now CONNECTED_SITE
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0284] device (eth0): Activation: successful, device activated.
Oct 02 07:14:17 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389257.0292] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 02 07:14:17 np0005465596.novalocal sudo[3935]: pam_unix(sudo:session): session closed for user root
Oct 02 07:14:17 np0005465596.novalocal python3[4023]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-5c25-7743-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:14:27 np0005465596.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 02 07:14:46 np0005465596.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 02 07:14:49 np0005465596.novalocal systemd[1063]: Starting Mark boot as successful...
Oct 02 07:14:49 np0005465596.novalocal systemd[1063]: Finished Mark boot as successful.
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.3533] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 02 07:15:02 np0005465596.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 02 07:15:02 np0005465596.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.3886] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.3890] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.3906] device (eth1): Activation: successful, device activated.
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.3916] manager: startup complete
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.3919] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <warn>  [1759389302.3927] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.3940] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct 02 07:15:02 np0005465596.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4002] dhcp4 (eth1): canceled DHCP transaction
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4003] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4003] dhcp4 (eth1): state changed no lease
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4028] policy: auto-activating connection 'ci-private-network' (cfce9517-0ffb-5f79-8907-2e072b5156ab)
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4037] device (eth1): Activation: starting connection 'ci-private-network' (cfce9517-0ffb-5f79-8907-2e072b5156ab)
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4039] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4044] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4056] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4069] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4126] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4129] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:15:02 np0005465596.novalocal NetworkManager[3947]: <info>  [1759389302.4140] device (eth1): Activation: successful, device activated.
Oct 02 07:15:12 np0005465596.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 02 07:15:17 np0005465596.novalocal sshd-session[1072]: Received disconnect from 38.102.83.114 port 54908:11: disconnected by user
Oct 02 07:15:17 np0005465596.novalocal sshd-session[1072]: Disconnected from user zuul 38.102.83.114 port 54908
Oct 02 07:15:17 np0005465596.novalocal sshd-session[1059]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:15:17 np0005465596.novalocal systemd-logind[827]: Session 1 logged out. Waiting for processes to exit.
Oct 02 07:15:44 np0005465596.novalocal sshd-session[4052]: Accepted publickey for zuul from 38.102.83.114 port 51488 ssh2: RSA SHA256:keY65lHhwVneQ22wb0gGakhmLa8aDARUTSvk0mdY3D0
Oct 02 07:15:44 np0005465596.novalocal systemd-logind[827]: New session 3 of user zuul.
Oct 02 07:15:44 np0005465596.novalocal systemd[1]: Started Session 3 of User zuul.
Oct 02 07:15:44 np0005465596.novalocal sshd-session[4052]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:15:44 np0005465596.novalocal sudo[4131]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqbcumyuwcziwtzacjajrufscqpxmfyw ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 02 07:15:44 np0005465596.novalocal sudo[4131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:15:44 np0005465596.novalocal python3[4133]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:15:44 np0005465596.novalocal sudo[4131]: pam_unix(sudo:session): session closed for user root
Oct 02 07:15:44 np0005465596.novalocal sudo[4204]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxqdgqnswvxycocvdjxadmdidhixqqpd ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 02 07:15:44 np0005465596.novalocal sudo[4204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:15:44 np0005465596.novalocal python3[4206]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759389344.1566055-312-47546432939219/source _original_basename=tmp051na_1c follow=False checksum=a0027086163ccdec79a407ddad0a2fbde34bc049 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:15:45 np0005465596.novalocal sudo[4204]: pam_unix(sudo:session): session closed for user root
Oct 02 07:15:48 np0005465596.novalocal sshd-session[4055]: Connection closed by 38.102.83.114 port 51488
Oct 02 07:15:48 np0005465596.novalocal sshd-session[4052]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:15:48 np0005465596.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Oct 02 07:15:48 np0005465596.novalocal systemd-logind[827]: Session 3 logged out. Waiting for processes to exit.
Oct 02 07:15:48 np0005465596.novalocal systemd-logind[827]: Removed session 3.
Oct 02 07:16:39 np0005465596.novalocal sshd-session[4231]: Received disconnect from 193.46.255.217 port 47868:11:  [preauth]
Oct 02 07:16:39 np0005465596.novalocal sshd-session[4231]: Disconnected from authenticating user root 193.46.255.217 port 47868 [preauth]
Oct 02 07:17:49 np0005465596.novalocal systemd[1063]: Created slice User Background Tasks Slice.
Oct 02 07:17:49 np0005465596.novalocal systemd[1063]: Starting Cleanup of User's Temporary Files and Directories...
Oct 02 07:17:49 np0005465596.novalocal systemd[1063]: Finished Cleanup of User's Temporary Files and Directories.
Oct 02 07:19:56 np0005465596.novalocal sshd-session[4237]: Invalid user ubnt from 213.55.79.195 port 54606
Oct 02 07:19:56 np0005465596.novalocal sshd-session[4237]: Connection closed by invalid user ubnt 213.55.79.195 port 54606 [preauth]
Oct 02 07:21:14 np0005465596.novalocal sshd-session[4240]: Accepted publickey for zuul from 38.102.83.114 port 37856 ssh2: RSA SHA256:keY65lHhwVneQ22wb0gGakhmLa8aDARUTSvk0mdY3D0
Oct 02 07:21:14 np0005465596.novalocal systemd-logind[827]: New session 4 of user zuul.
Oct 02 07:21:14 np0005465596.novalocal systemd[1]: Started Session 4 of User zuul.
Oct 02 07:21:14 np0005465596.novalocal sshd-session[4240]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:21:14 np0005465596.novalocal sudo[4267]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-begdeeqazhsmjoroyiyptihlvyohyxnq ; /usr/bin/python3'
Oct 02 07:21:14 np0005465596.novalocal sudo[4267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:14 np0005465596.novalocal python3[4269]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-85a8-e015-000000001cf1-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:21:14 np0005465596.novalocal sudo[4267]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:15 np0005465596.novalocal sudo[4296]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpbophjstxithzoeeofsngavtsgwctii ; /usr/bin/python3'
Oct 02 07:21:15 np0005465596.novalocal sudo[4296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:15 np0005465596.novalocal python3[4298]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:21:15 np0005465596.novalocal sudo[4296]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:15 np0005465596.novalocal sudo[4322]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdffgskqtnthnzrqcktweofkywwcapkx ; /usr/bin/python3'
Oct 02 07:21:15 np0005465596.novalocal sudo[4322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:15 np0005465596.novalocal python3[4324]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:21:15 np0005465596.novalocal sudo[4322]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:15 np0005465596.novalocal sudo[4348]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmrjtnejwqpyqamxuyyndvmmaaxqespo ; /usr/bin/python3'
Oct 02 07:21:15 np0005465596.novalocal sudo[4348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:16 np0005465596.novalocal python3[4350]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:21:16 np0005465596.novalocal sudo[4348]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:16 np0005465596.novalocal sudo[4374]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kivysbmnwjimwalbwumrfpikzjntrody ; /usr/bin/python3'
Oct 02 07:21:16 np0005465596.novalocal sudo[4374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:16 np0005465596.novalocal python3[4376]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:21:16 np0005465596.novalocal sudo[4374]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:16 np0005465596.novalocal sudo[4400]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zenmfnldilwjodixuhjpyfsysegyxttn ; /usr/bin/python3'
Oct 02 07:21:16 np0005465596.novalocal sudo[4400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:16 np0005465596.novalocal python3[4402]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:21:16 np0005465596.novalocal python3[4402]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct 02 07:21:16 np0005465596.novalocal sudo[4400]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:17 np0005465596.novalocal sudo[4426]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pukxslkqbdzcfwaeltnvvlgiasipmgkx ; /usr/bin/python3'
Oct 02 07:21:17 np0005465596.novalocal sudo[4426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:17 np0005465596.novalocal python3[4428]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 07:21:17 np0005465596.novalocal systemd[1]: Reloading.
Oct 02 07:21:17 np0005465596.novalocal systemd-rc-local-generator[4449]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:21:18 np0005465596.novalocal sudo[4426]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:19 np0005465596.novalocal sudo[4481]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etykggwjrhfwmpzjdsujcfqtxzlwsypn ; /usr/bin/python3'
Oct 02 07:21:19 np0005465596.novalocal sudo[4481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:19 np0005465596.novalocal python3[4483]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct 02 07:21:19 np0005465596.novalocal sudo[4481]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:19 np0005465596.novalocal sudo[4507]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckmqowzwjlywilveihadfsltioocehzo ; /usr/bin/python3'
Oct 02 07:21:19 np0005465596.novalocal sudo[4507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:19 np0005465596.novalocal python3[4509]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:21:19 np0005465596.novalocal sudo[4507]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:19 np0005465596.novalocal sudo[4535]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywfbyszohimjhxmqgflnelqjsvyxuebp ; /usr/bin/python3'
Oct 02 07:21:19 np0005465596.novalocal sudo[4535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:20 np0005465596.novalocal python3[4537]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:21:20 np0005465596.novalocal sudo[4535]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:20 np0005465596.novalocal sudo[4563]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dohsiknzaelcmyrpldpiqhtctnxcsodv ; /usr/bin/python3'
Oct 02 07:21:20 np0005465596.novalocal sudo[4563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:20 np0005465596.novalocal python3[4565]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:21:20 np0005465596.novalocal sudo[4563]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:20 np0005465596.novalocal sudo[4591]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzkyduadxofmnizjvlilqpbfemsgwhud ; /usr/bin/python3'
Oct 02 07:21:20 np0005465596.novalocal sudo[4591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:20 np0005465596.novalocal python3[4593]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:21:20 np0005465596.novalocal sudo[4591]: pam_unix(sudo:session): session closed for user root
Oct 02 07:21:21 np0005465596.novalocal python3[4620]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-85a8-e015-000000001cf7-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:21:21 np0005465596.novalocal python3[4650]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:21:24 np0005465596.novalocal sshd-session[4243]: Connection closed by 38.102.83.114 port 37856
Oct 02 07:21:24 np0005465596.novalocal sshd-session[4240]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:21:24 np0005465596.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Oct 02 07:21:24 np0005465596.novalocal systemd[1]: session-4.scope: Consumed 3.832s CPU time.
Oct 02 07:21:24 np0005465596.novalocal systemd-logind[827]: Session 4 logged out. Waiting for processes to exit.
Oct 02 07:21:24 np0005465596.novalocal systemd-logind[827]: Removed session 4.
Oct 02 07:21:26 np0005465596.novalocal sshd-session[4658]: Accepted publickey for zuul from 38.102.83.114 port 35672 ssh2: RSA SHA256:keY65lHhwVneQ22wb0gGakhmLa8aDARUTSvk0mdY3D0
Oct 02 07:21:26 np0005465596.novalocal systemd-logind[827]: New session 5 of user zuul.
Oct 02 07:21:26 np0005465596.novalocal systemd[1]: Started Session 5 of User zuul.
Oct 02 07:21:26 np0005465596.novalocal sshd-session[4658]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:21:26 np0005465596.novalocal sudo[4685]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwrbdqagfipfwfjhjygsybonhdhdnsso ; /usr/bin/python3'
Oct 02 07:21:26 np0005465596.novalocal sudo[4685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:21:26 np0005465596.novalocal python3[4687]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 02 07:21:29 np0005465596.novalocal irqbalance[823]: Cannot change IRQ 27 affinity: Operation not permitted
Oct 02 07:21:29 np0005465596.novalocal irqbalance[823]: IRQ 27 affinity is now unmanaged
Oct 02 07:21:42 np0005465596.novalocal kernel: SELinux:  Converting 363 SID table entries...
Oct 02 07:21:42 np0005465596.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:21:42 np0005465596.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 02 07:21:42 np0005465596.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:21:42 np0005465596.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:21:42 np0005465596.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:21:42 np0005465596.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:21:42 np0005465596.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:21:50 np0005465596.novalocal kernel: SELinux:  Converting 363 SID table entries...
Oct 02 07:21:50 np0005465596.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:21:50 np0005465596.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 02 07:21:50 np0005465596.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:21:50 np0005465596.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:21:50 np0005465596.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:21:50 np0005465596.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:21:50 np0005465596.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:21:59 np0005465596.novalocal kernel: SELinux:  Converting 363 SID table entries...
Oct 02 07:21:59 np0005465596.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:21:59 np0005465596.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 02 07:21:59 np0005465596.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:21:59 np0005465596.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:21:59 np0005465596.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:21:59 np0005465596.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:21:59 np0005465596.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:22:01 np0005465596.novalocal setsebool[4753]: The virt_use_nfs policy boolean was changed to 1 by root
Oct 02 07:22:01 np0005465596.novalocal setsebool[4753]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct 02 07:22:12 np0005465596.novalocal kernel: SELinux:  Converting 366 SID table entries...
Oct 02 07:22:12 np0005465596.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:22:12 np0005465596.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 02 07:22:12 np0005465596.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:22:12 np0005465596.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:22:12 np0005465596.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:22:12 np0005465596.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:22:12 np0005465596.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:22:30 np0005465596.novalocal dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 02 07:22:30 np0005465596.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 02 07:22:30 np0005465596.novalocal systemd[1]: Starting man-db-cache-update.service...
Oct 02 07:22:30 np0005465596.novalocal systemd[1]: Reloading.
Oct 02 07:22:30 np0005465596.novalocal systemd-rc-local-generator[5510]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:22:30 np0005465596.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Oct 02 07:22:32 np0005465596.novalocal systemd[1]: Starting PackageKit Daemon...
Oct 02 07:22:32 np0005465596.novalocal PackageKit[6175]: daemon start
Oct 02 07:22:32 np0005465596.novalocal systemd[1]: Starting Authorization Manager...
Oct 02 07:22:32 np0005465596.novalocal polkitd[6272]: Started polkitd version 0.117
Oct 02 07:22:32 np0005465596.novalocal polkitd[6272]: Loading rules from directory /etc/polkit-1/rules.d
Oct 02 07:22:32 np0005465596.novalocal polkitd[6272]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 02 07:22:32 np0005465596.novalocal polkitd[6272]: Finished loading, compiling and executing 3 rules
Oct 02 07:22:32 np0005465596.novalocal polkitd[6272]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Oct 02 07:22:32 np0005465596.novalocal systemd[1]: Started Authorization Manager.
Oct 02 07:22:32 np0005465596.novalocal systemd[1]: Started PackageKit Daemon.
Oct 02 07:22:32 np0005465596.novalocal sudo[4685]: pam_unix(sudo:session): session closed for user root
Oct 02 07:22:33 np0005465596.novalocal python3[7071]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-cac3-292a-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:22:34 np0005465596.novalocal kernel: evm: overlay not supported
Oct 02 07:22:34 np0005465596.novalocal systemd[1063]: Starting D-Bus User Message Bus...
Oct 02 07:22:34 np0005465596.novalocal dbus-broker-launch[8133]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct 02 07:22:34 np0005465596.novalocal dbus-broker-launch[8133]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct 02 07:22:34 np0005465596.novalocal systemd[1063]: Started D-Bus User Message Bus.
Oct 02 07:22:34 np0005465596.novalocal dbus-broker-lau[8133]: Ready
Oct 02 07:22:34 np0005465596.novalocal systemd[1063]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 02 07:22:34 np0005465596.novalocal systemd[1063]: Created slice Slice /user.
Oct 02 07:22:34 np0005465596.novalocal systemd[1063]: podman-8004.scope: unit configures an IP firewall, but not running as root.
Oct 02 07:22:34 np0005465596.novalocal systemd[1063]: (This warning is only shown for the first unit using IP firewalling.)
Oct 02 07:22:34 np0005465596.novalocal systemd[1063]: Started podman-8004.scope.
Oct 02 07:22:34 np0005465596.novalocal systemd[1063]: Started podman-pause-0d6e47fe.scope.
Oct 02 07:22:34 np0005465596.novalocal sudo[8811]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddybbxrpadtloivpmafczqjldgxrddsn ; /usr/bin/python3'
Oct 02 07:22:34 np0005465596.novalocal sudo[8811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:22:35 np0005465596.novalocal python3[8840]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                      location = "38.102.83.65:5001"
                                                      insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                      location = "38.102.83.65:5001"
                                                      insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:22:35 np0005465596.novalocal sudo[8811]: pam_unix(sudo:session): session closed for user root
Oct 02 07:22:35 np0005465596.novalocal sshd-session[4661]: Connection closed by 38.102.83.114 port 35672
Oct 02 07:22:35 np0005465596.novalocal sshd-session[4658]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:22:35 np0005465596.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Oct 02 07:22:35 np0005465596.novalocal systemd[1]: session-5.scope: Consumed 1min 364ms CPU time.
Oct 02 07:22:35 np0005465596.novalocal systemd-logind[827]: Session 5 logged out. Waiting for processes to exit.
Oct 02 07:22:35 np0005465596.novalocal systemd-logind[827]: Removed session 5.
Oct 02 07:22:36 np0005465596.novalocal sshd-session[9732]: Connection closed by 193.32.162.151 port 33010
Oct 02 07:22:53 np0005465596.novalocal sshd-session[16123]: Connection closed by 38.102.83.120 port 54740 [preauth]
Oct 02 07:22:53 np0005465596.novalocal sshd-session[16124]: Connection closed by 38.102.83.120 port 54746 [preauth]
Oct 02 07:22:53 np0005465596.novalocal sshd-session[16129]: Unable to negotiate with 38.102.83.120 port 54754: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Oct 02 07:22:53 np0005465596.novalocal sshd-session[16127]: Unable to negotiate with 38.102.83.120 port 54764: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Oct 02 07:22:53 np0005465596.novalocal sshd-session[16131]: Unable to negotiate with 38.102.83.120 port 54780: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Oct 02 07:22:58 np0005465596.novalocal sshd-session[17758]: Accepted publickey for zuul from 38.102.83.114 port 58606 ssh2: RSA SHA256:keY65lHhwVneQ22wb0gGakhmLa8aDARUTSvk0mdY3D0
Oct 02 07:22:58 np0005465596.novalocal systemd-logind[827]: New session 6 of user zuul.
Oct 02 07:22:58 np0005465596.novalocal systemd[1]: Started Session 6 of User zuul.
Oct 02 07:22:58 np0005465596.novalocal sshd-session[17758]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:22:59 np0005465596.novalocal python3[17859]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFTccvmtTEFt0esG+GQT9xjCtZX41X/iqoL9ZYj4Xg6IUZR0zsf6XEEbbATfdjROiA7pIoSUe3SAji6Ty7S4068= zuul@np0005465595.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:22:59 np0005465596.novalocal sudo[18011]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcocvtfzvosxobuanqnjmjrqfzkyekpz ; /usr/bin/python3'
Oct 02 07:22:59 np0005465596.novalocal sudo[18011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:22:59 np0005465596.novalocal python3[18020]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFTccvmtTEFt0esG+GQT9xjCtZX41X/iqoL9ZYj4Xg6IUZR0zsf6XEEbbATfdjROiA7pIoSUe3SAji6Ty7S4068= zuul@np0005465595.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:22:59 np0005465596.novalocal sudo[18011]: pam_unix(sudo:session): session closed for user root
Oct 02 07:23:00 np0005465596.novalocal sudo[18304]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnipnggbvpdsewmjmhzhwqrprdugrmbl ; /usr/bin/python3'
Oct 02 07:23:00 np0005465596.novalocal sudo[18304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:23:00 np0005465596.novalocal python3[18314]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005465596.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct 02 07:23:00 np0005465596.novalocal useradd[18385]: new group: name=cloud-admin, GID=1002
Oct 02 07:23:00 np0005465596.novalocal useradd[18385]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Oct 02 07:23:00 np0005465596.novalocal sudo[18304]: pam_unix(sudo:session): session closed for user root
Oct 02 07:23:00 np0005465596.novalocal sudo[18522]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmxigodczmikxlyrtbejfkvrvmakkugw ; /usr/bin/python3'
Oct 02 07:23:00 np0005465596.novalocal sudo[18522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:23:00 np0005465596.novalocal python3[18532]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFTccvmtTEFt0esG+GQT9xjCtZX41X/iqoL9ZYj4Xg6IUZR0zsf6XEEbbATfdjROiA7pIoSUe3SAji6Ty7S4068= zuul@np0005465595.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 02 07:23:01 np0005465596.novalocal sudo[18522]: pam_unix(sudo:session): session closed for user root
Oct 02 07:23:01 np0005465596.novalocal sudo[18776]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdoqgiihceyttlrhdzbxrdhfwtvdfdqf ; /usr/bin/python3'
Oct 02 07:23:01 np0005465596.novalocal sudo[18776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:23:01 np0005465596.novalocal python3[18786]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:23:01 np0005465596.novalocal sudo[18776]: pam_unix(sudo:session): session closed for user root
Oct 02 07:23:01 np0005465596.novalocal sudo[19019]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haaolumzwrssdpifodsqzihvjgqrullo ; /usr/bin/python3'
Oct 02 07:23:01 np0005465596.novalocal sudo[19019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:23:02 np0005465596.novalocal python3[19029]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759389781.1736953-151-10262615780375/source _original_basename=tmpnraxozt5 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:23:02 np0005465596.novalocal sudo[19019]: pam_unix(sudo:session): session closed for user root
Oct 02 07:23:02 np0005465596.novalocal sudo[19335]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwkzvktdyhuxzklzodjjretumlxkdhwj ; /usr/bin/python3'
Oct 02 07:23:02 np0005465596.novalocal sudo[19335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:23:03 np0005465596.novalocal python3[19347]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Oct 02 07:23:03 np0005465596.novalocal systemd[1]: Starting Hostname Service...
Oct 02 07:23:03 np0005465596.novalocal systemd[1]: Started Hostname Service.
Oct 02 07:23:03 np0005465596.novalocal systemd-hostnamed[19436]: Changed pretty hostname to 'compute-0'
Oct 02 07:23:03 compute-0 systemd-hostnamed[19436]: Hostname set to <compute-0> (static)
Oct 02 07:23:03 compute-0 NetworkManager[3947]: <info>  [1759389783.2428] hostname: static hostname changed from "np0005465596.novalocal" to "compute-0"
Oct 02 07:23:03 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 02 07:23:03 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 02 07:23:03 compute-0 sudo[19335]: pam_unix(sudo:session): session closed for user root
Oct 02 07:23:03 compute-0 sshd-session[17803]: Connection closed by 38.102.83.114 port 58606
Oct 02 07:23:03 compute-0 sshd-session[17758]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:23:03 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Oct 02 07:23:03 compute-0 systemd[1]: session-6.scope: Consumed 2.740s CPU time.
Oct 02 07:23:03 compute-0 systemd-logind[827]: Session 6 logged out. Waiting for processes to exit.
Oct 02 07:23:03 compute-0 systemd-logind[827]: Removed session 6.
Oct 02 07:23:13 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 02 07:23:27 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 02 07:23:27 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 02 07:23:27 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1min 9.212s CPU time.
Oct 02 07:23:27 compute-0 systemd[1]: run-rf440368740ec45f1bb13b75924a328b5.service: Deactivated successfully.
Oct 02 07:23:33 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 02 07:23:42 compute-0 sshd-session[26577]: Received disconnect from 193.46.255.103 port 37496:11:  [preauth]
Oct 02 07:23:42 compute-0 sshd-session[26577]: Disconnected from authenticating user root 193.46.255.103 port 37496 [preauth]
Oct 02 07:24:15 compute-0 sshd-session[26579]: Invalid user config from 213.59.165.109 port 52601
Oct 02 07:24:16 compute-0 sshd-session[26579]: Connection closed by invalid user config 213.59.165.109 port 52601 [preauth]
Oct 02 07:26:30 compute-0 sshd-session[26584]: Accepted publickey for zuul from 38.102.83.120 port 56280 ssh2: RSA SHA256:keY65lHhwVneQ22wb0gGakhmLa8aDARUTSvk0mdY3D0
Oct 02 07:26:30 compute-0 systemd-logind[827]: New session 7 of user zuul.
Oct 02 07:26:30 compute-0 systemd[1]: Started Session 7 of User zuul.
Oct 02 07:26:30 compute-0 sshd-session[26584]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:26:30 compute-0 python3[26660]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:26:32 compute-0 sudo[26774]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqsddelyvkpoqamukyieqpcrsulvprdo ; /usr/bin/python3'
Oct 02 07:26:32 compute-0 sudo[26774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:32 compute-0 python3[26776]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:26:32 compute-0 sudo[26774]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:32 compute-0 sudo[26847]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwecjyugsinuelttsrwwgpwqmzcubvkp ; /usr/bin/python3'
Oct 02 07:26:32 compute-0 sudo[26847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:32 compute-0 python3[26849]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759389992.0843246-30382-8822230609216/source mode=0755 _original_basename=delorean.repo follow=False checksum=bb4c2ff9dad546f135d54d9729ea11b84117755d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:26:33 compute-0 sudo[26847]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:33 compute-0 sudo[26873]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ervveqhmhoeqbcvmnkszkqexnndbrkoo ; /usr/bin/python3'
Oct 02 07:26:33 compute-0 sudo[26873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:33 compute-0 python3[26875]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:26:33 compute-0 sudo[26873]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:33 compute-0 sudo[26946]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euvrrjrmelrelqpfnjngubyqkxdvmsmq ; /usr/bin/python3'
Oct 02 07:26:33 compute-0 sudo[26946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:33 compute-0 python3[26948]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759389992.0843246-30382-8822230609216/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:26:33 compute-0 sudo[26946]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:33 compute-0 sudo[26972]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsrnswcscagoeyaqzigitxqahpcwseym ; /usr/bin/python3'
Oct 02 07:26:33 compute-0 sudo[26972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:33 compute-0 python3[26974]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:26:33 compute-0 sudo[26972]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:34 compute-0 sudo[27045]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvkjldwwbvyzfcdkliauxvunaycrnpra ; /usr/bin/python3'
Oct 02 07:26:34 compute-0 sudo[27045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:34 compute-0 python3[27047]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759389992.0843246-30382-8822230609216/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:26:34 compute-0 sudo[27045]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:34 compute-0 sudo[27071]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pajpmfktmbjkwigzgiabonructhtxfug ; /usr/bin/python3'
Oct 02 07:26:34 compute-0 sudo[27071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:34 compute-0 python3[27073]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:26:34 compute-0 sudo[27071]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:34 compute-0 sudo[27144]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnsplbxgvibjtvmzsggpkftifbwfmuzu ; /usr/bin/python3'
Oct 02 07:26:34 compute-0 sudo[27144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:35 compute-0 python3[27146]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759389992.0843246-30382-8822230609216/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:26:35 compute-0 sudo[27144]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:35 compute-0 sudo[27170]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owbrodjulizgzamsshpiuwuczbzmczal ; /usr/bin/python3'
Oct 02 07:26:35 compute-0 sudo[27170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:35 compute-0 python3[27172]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:26:35 compute-0 sudo[27170]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:35 compute-0 sudo[27243]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeoomzqjgpjbpbggybieadrwltjujutc ; /usr/bin/python3'
Oct 02 07:26:35 compute-0 sudo[27243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:35 compute-0 python3[27245]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759389992.0843246-30382-8822230609216/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:26:35 compute-0 sudo[27243]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:35 compute-0 sudo[27269]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxowcnckoyxaflprrqhhhypfccybkohy ; /usr/bin/python3'
Oct 02 07:26:35 compute-0 sudo[27269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:36 compute-0 python3[27271]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:26:36 compute-0 sudo[27269]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:36 compute-0 sudo[27342]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdizuyzjrehdijwutcrsoukuvlwwchpt ; /usr/bin/python3'
Oct 02 07:26:36 compute-0 sudo[27342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:36 compute-0 python3[27344]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759389992.0843246-30382-8822230609216/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:26:36 compute-0 sudo[27342]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:36 compute-0 sudo[27368]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffnbvfzyjzdoltfuywpfuufqhacfsgiy ; /usr/bin/python3'
Oct 02 07:26:36 compute-0 sudo[27368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:36 compute-0 python3[27370]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 02 07:26:36 compute-0 sudo[27368]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:37 compute-0 sudo[27441]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvcxvidudkuqfnmpvbcfmbgwekvjedgw ; /usr/bin/python3'
Oct 02 07:26:37 compute-0 sudo[27441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:26:37 compute-0 python3[27443]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759389992.0843246-30382-8822230609216/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=d911291791b114a72daf18f370e91cb1ae300933 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:26:37 compute-0 sudo[27441]: pam_unix(sudo:session): session closed for user root
Oct 02 07:26:39 compute-0 sshd-session[27468]: Connection closed by 192.168.122.11 port 43012 [preauth]
Oct 02 07:26:39 compute-0 sshd-session[27471]: Unable to negotiate with 192.168.122.11 port 43028: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Oct 02 07:26:39 compute-0 sshd-session[27469]: Connection closed by 192.168.122.11 port 43024 [preauth]
Oct 02 07:26:39 compute-0 sshd-session[27470]: Unable to negotiate with 192.168.122.11 port 43026: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Oct 02 07:26:39 compute-0 sshd-session[27473]: Unable to negotiate with 192.168.122.11 port 43040: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Oct 02 07:27:28 compute-0 sshd-session[27479]: error: kex_exchange_identification: read: Connection reset by peer
Oct 02 07:27:28 compute-0 sshd-session[27479]: Connection reset by 45.140.17.97 port 9469
Oct 02 07:27:28 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Oct 02 07:27:28 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 02 07:27:28 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Oct 02 07:27:28 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct 02 07:27:37 compute-0 PackageKit[6175]: daemon quit
Oct 02 07:27:37 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 02 07:28:04 compute-0 python3[27507]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:31:05 compute-0 sshd-session[27511]: Received disconnect from 141.98.10.225 port 21140:11:  [preauth]
Oct 02 07:31:05 compute-0 sshd-session[27511]: Disconnected from authenticating user root 141.98.10.225 port 21140 [preauth]
Oct 02 07:33:04 compute-0 sshd-session[26587]: Received disconnect from 38.102.83.120 port 56280:11: disconnected by user
Oct 02 07:33:04 compute-0 sshd-session[26587]: Disconnected from user zuul 38.102.83.120 port 56280
Oct 02 07:33:04 compute-0 sshd-session[26584]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:33:04 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Oct 02 07:33:04 compute-0 systemd[1]: session-7.scope: Consumed 5.759s CPU time.
Oct 02 07:33:04 compute-0 systemd-logind[827]: Session 7 logged out. Waiting for processes to exit.
Oct 02 07:33:04 compute-0 systemd-logind[827]: Removed session 7.
Oct 02 07:37:20 compute-0 sshd-session[27515]: Received disconnect from 91.224.92.108 port 43544:11:  [preauth]
Oct 02 07:37:20 compute-0 sshd-session[27515]: Disconnected from authenticating user root 91.224.92.108 port 43544 [preauth]
Oct 02 07:37:24 compute-0 sshd-session[27517]: Invalid user unknown from 94.101.25.93 port 48229
Oct 02 07:37:26 compute-0 sshd-session[27519]: Connection closed by 193.95.24.116 port 45252
Oct 02 07:39:16 compute-0 sshd-session[27520]: Accepted publickey for zuul from 192.168.122.30 port 54834 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:39:16 compute-0 systemd-logind[827]: New session 8 of user zuul.
Oct 02 07:39:16 compute-0 systemd[1]: Started Session 8 of User zuul.
Oct 02 07:39:16 compute-0 sshd-session[27520]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:39:17 compute-0 python3.9[27673]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:39:18 compute-0 sudo[27853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfhvjooeqbtynkhxvudgcywcxcnumvrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390758.070955-44-261319573040555/AnsiballZ_command.py'
Oct 02 07:39:18 compute-0 sudo[27853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:39:18 compute-0 python3.9[27855]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:39:23 compute-0 sshd[1011]: Timeout before authentication for connection from 94.101.25.93 to 38.102.83.73, pid = 27517
Oct 02 07:39:25 compute-0 sudo[27853]: pam_unix(sudo:session): session closed for user root
Oct 02 07:39:26 compute-0 sshd-session[27523]: Connection closed by 192.168.122.30 port 54834
Oct 02 07:39:26 compute-0 sshd-session[27520]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:39:26 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Oct 02 07:39:26 compute-0 systemd[1]: session-8.scope: Consumed 8.018s CPU time.
Oct 02 07:39:26 compute-0 systemd-logind[827]: Session 8 logged out. Waiting for processes to exit.
Oct 02 07:39:26 compute-0 systemd-logind[827]: Removed session 8.
Oct 02 07:39:31 compute-0 sshd-session[27912]: Accepted publickey for zuul from 192.168.122.30 port 42064 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:39:31 compute-0 systemd-logind[827]: New session 9 of user zuul.
Oct 02 07:39:31 compute-0 systemd[1]: Started Session 9 of User zuul.
Oct 02 07:39:31 compute-0 sshd-session[27912]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:39:32 compute-0 python3.9[28065]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:39:33 compute-0 sshd-session[27915]: Connection closed by 192.168.122.30 port 42064
Oct 02 07:39:33 compute-0 sshd-session[27912]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:39:33 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Oct 02 07:39:33 compute-0 systemd-logind[827]: Session 9 logged out. Waiting for processes to exit.
Oct 02 07:39:33 compute-0 systemd-logind[827]: Removed session 9.
Oct 02 07:39:48 compute-0 sshd-session[28094]: Accepted publickey for zuul from 192.168.122.30 port 53666 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:39:48 compute-0 systemd-logind[827]: New session 10 of user zuul.
Oct 02 07:39:48 compute-0 systemd[1]: Started Session 10 of User zuul.
Oct 02 07:39:48 compute-0 sshd-session[28094]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:39:49 compute-0 python3.9[28247]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 02 07:39:51 compute-0 python3.9[28421]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:39:51 compute-0 sudo[28571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caqjzfdfzpktghlveycuxkuyngkdbgtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390791.3996205-69-77814140947127/AnsiballZ_command.py'
Oct 02 07:39:51 compute-0 sudo[28571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:39:52 compute-0 python3.9[28573]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:39:52 compute-0 sudo[28571]: pam_unix(sudo:session): session closed for user root
Oct 02 07:39:53 compute-0 sudo[28724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjlmurysliwklowjmfppyntuszfylbur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390792.6209307-93-188617350300264/AnsiballZ_stat.py'
Oct 02 07:39:53 compute-0 sudo[28724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:39:53 compute-0 python3.9[28726]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:39:53 compute-0 sudo[28724]: pam_unix(sudo:session): session closed for user root
Oct 02 07:39:53 compute-0 sudo[28876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftgyqhovowuozhfpjrjvmzhrulzcnvpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390793.4687374-109-185566767578326/AnsiballZ_file.py'
Oct 02 07:39:53 compute-0 sudo[28876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:39:54 compute-0 python3.9[28878]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:39:54 compute-0 sudo[28876]: pam_unix(sudo:session): session closed for user root
Oct 02 07:39:54 compute-0 sudo[29028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phxqatlmwnrqvsdvxenknkafeprcnjrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390794.3564053-125-114005159333267/AnsiballZ_stat.py'
Oct 02 07:39:54 compute-0 sudo[29028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:39:54 compute-0 python3.9[29030]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:39:54 compute-0 sudo[29028]: pam_unix(sudo:session): session closed for user root
Oct 02 07:39:55 compute-0 sudo[29151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxnuskggbexfzcrcxqwvxwxucblurtxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390794.3564053-125-114005159333267/AnsiballZ_copy.py'
Oct 02 07:39:55 compute-0 sudo[29151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:39:55 compute-0 python3.9[29153]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759390794.3564053-125-114005159333267/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:39:55 compute-0 sudo[29151]: pam_unix(sudo:session): session closed for user root
Oct 02 07:39:56 compute-0 sudo[29303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxfdtfovnmtxdalycegzgyqmuxhhnube ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390795.838902-155-118901426149708/AnsiballZ_setup.py'
Oct 02 07:39:56 compute-0 sudo[29303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:39:56 compute-0 python3.9[29305]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:39:56 compute-0 sudo[29303]: pam_unix(sudo:session): session closed for user root
Oct 02 07:39:57 compute-0 sudo[29459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdhcbzdgubvaqcxnxpjigikniuxeduub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390796.9291308-171-11992917019040/AnsiballZ_file.py'
Oct 02 07:39:57 compute-0 sudo[29459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:39:57 compute-0 python3.9[29461]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:39:57 compute-0 sudo[29459]: pam_unix(sudo:session): session closed for user root
Oct 02 07:39:58 compute-0 python3.9[29611]: ansible-ansible.builtin.service_facts Invoked
Oct 02 07:40:02 compute-0 python3.9[29866]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:40:03 compute-0 python3.9[30016]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:40:04 compute-0 python3.9[30170]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:40:05 compute-0 sudo[30326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glvecxrrufwrqtfluhnjsiixatetmyyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390805.059648-267-58382261436338/AnsiballZ_setup.py'
Oct 02 07:40:05 compute-0 sudo[30326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:40:05 compute-0 python3.9[30328]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:40:06 compute-0 sudo[30326]: pam_unix(sudo:session): session closed for user root
Oct 02 07:40:06 compute-0 sudo[30410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztnkjyiqenkbglhjbuknnecxgdqbubau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390805.059648-267-58382261436338/AnsiballZ_dnf.py'
Oct 02 07:40:06 compute-0 sudo[30410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:40:06 compute-0 python3.9[30412]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:40:50 compute-0 systemd[1]: Reloading.
Oct 02 07:40:50 compute-0 systemd-rc-local-generator[30611]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:40:50 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct 02 07:40:51 compute-0 systemd[1]: Reloading.
Oct 02 07:40:51 compute-0 systemd-rc-local-generator[30652]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:40:51 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct 02 07:40:51 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct 02 07:40:51 compute-0 systemd[1]: Reloading.
Oct 02 07:40:51 compute-0 systemd-rc-local-generator[30689]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:40:51 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Oct 02 07:40:52 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Oct 02 07:40:52 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Oct 02 07:40:52 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Oct 02 07:41:10 compute-0 sshd-session[30767]: Connection closed by 193.32.162.151 port 60340
Oct 02 07:41:17 compute-0 sshd-session[30788]: Invalid user admin from 178.250.191.189 port 53970
Oct 02 07:41:17 compute-0 sshd-session[30788]: Connection closed by invalid user admin 178.250.191.189 port 53970 [preauth]
Oct 02 07:41:57 compute-0 kernel: SELinux:  Converting 2713 SID table entries...
Oct 02 07:41:57 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:41:57 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 02 07:41:57 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:41:57 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:41:57 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:41:57 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:41:57 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:41:57 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct 02 07:41:58 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 02 07:41:58 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 02 07:41:58 compute-0 systemd[1]: Reloading.
Oct 02 07:41:58 compute-0 systemd-rc-local-generator[31007]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:41:58 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 02 07:41:58 compute-0 systemd[1]: Starting PackageKit Daemon...
Oct 02 07:41:58 compute-0 PackageKit[31176]: daemon start
Oct 02 07:41:58 compute-0 systemd[1]: Started PackageKit Daemon.
Oct 02 07:41:58 compute-0 sudo[30410]: pam_unix(sudo:session): session closed for user root
Oct 02 07:41:59 compute-0 sudo[31924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oonvlptdtetteluppzssxjmnnntpehbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390919.1156802-291-275364349332324/AnsiballZ_command.py'
Oct 02 07:41:59 compute-0 sudo[31924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:41:59 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 02 07:41:59 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 02 07:41:59 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.589s CPU time.
Oct 02 07:41:59 compute-0 systemd[1]: run-r94aaf9840f3049bc88837fb3a9545dae.service: Deactivated successfully.
Oct 02 07:41:59 compute-0 python3.9[31926]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:42:00 compute-0 sudo[31924]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:01 compute-0 sudo[32206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nftijtjxrpdzgdnihmepfvutalulaqpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390921.0915687-307-65862508869308/AnsiballZ_selinux.py'
Oct 02 07:42:01 compute-0 sudo[32206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:02 compute-0 python3.9[32208]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct 02 07:42:02 compute-0 sudo[32206]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:02 compute-0 sudo[32358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yisrxiywhzjwvygfzlvthmirjiflhymn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390922.6136417-329-141007741981981/AnsiballZ_command.py'
Oct 02 07:42:02 compute-0 sudo[32358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:03 compute-0 python3.9[32360]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct 02 07:42:04 compute-0 sudo[32358]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:04 compute-0 sudo[32513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqbdidpjiddbjbcvwoabkkkscnvaowid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390924.2683847-345-192836317348052/AnsiballZ_file.py'
Oct 02 07:42:04 compute-0 sudo[32513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:06 compute-0 python3.9[32515]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:42:06 compute-0 sudo[32513]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:07 compute-0 sudo[32665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buetrocgbtlfoeoujttpibkidggyxlfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390926.8665245-361-240805109322528/AnsiballZ_mount.py'
Oct 02 07:42:07 compute-0 sudo[32665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:07 compute-0 python3.9[32667]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct 02 07:42:07 compute-0 sudo[32665]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:09 compute-0 sudo[32817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjwcydggqzjrkavfdrrykgzngnbeqqco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390928.6437924-417-80985424135980/AnsiballZ_file.py'
Oct 02 07:42:09 compute-0 sudo[32817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:09 compute-0 python3.9[32819]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:42:09 compute-0 sudo[32817]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:09 compute-0 sudo[32969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzdgbhuwpejikwhzsmadzodrexyffwrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390929.4923205-433-5567743316321/AnsiballZ_stat.py'
Oct 02 07:42:09 compute-0 sudo[32969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:14 compute-0 python3.9[32971]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:42:14 compute-0 sudo[32969]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:15 compute-0 sudo[33092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jblnxugbskumxbpshplxhcoketroweys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390929.4923205-433-5567743316321/AnsiballZ_copy.py'
Oct 02 07:42:15 compute-0 sudo[33092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:15 compute-0 python3.9[33094]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759390929.4923205-433-5567743316321/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eca61761ba526b6f995456bd5ca7bb1b26d84647 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:42:15 compute-0 sudo[33092]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:16 compute-0 sudo[33244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khubvmccnijqzsbuveniuhhdalujzbqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390936.2918644-487-121705404624612/AnsiballZ_getent.py'
Oct 02 07:42:16 compute-0 sudo[33244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:17 compute-0 python3.9[33246]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct 02 07:42:17 compute-0 sudo[33244]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:17 compute-0 sudo[33397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvwzualagxhneippumnubkikvzspdbsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390937.355636-503-213680338619500/AnsiballZ_group.py'
Oct 02 07:42:17 compute-0 sudo[33397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:18 compute-0 python3.9[33399]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 02 07:42:18 compute-0 groupadd[33400]: group added to /etc/group: name=qemu, GID=107
Oct 02 07:42:18 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 07:42:18 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 07:42:18 compute-0 groupadd[33400]: group added to /etc/gshadow: name=qemu
Oct 02 07:42:18 compute-0 groupadd[33400]: new group: name=qemu, GID=107
Oct 02 07:42:18 compute-0 sudo[33397]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:18 compute-0 sudo[33556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eknqaoliicaorscbmmillglhqirrfiir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390938.34249-519-201076757277324/AnsiballZ_user.py'
Oct 02 07:42:18 compute-0 sudo[33556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:19 compute-0 python3.9[33558]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 02 07:42:19 compute-0 useradd[33560]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Oct 02 07:42:19 compute-0 sudo[33556]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:19 compute-0 sudo[33716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thpcrafgitnzhyqhxqmatfvrfwlmsozs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390939.5278552-535-180597211363729/AnsiballZ_getent.py'
Oct 02 07:42:19 compute-0 sudo[33716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:20 compute-0 python3.9[33718]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct 02 07:42:20 compute-0 sudo[33716]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:20 compute-0 sudo[33869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiuccwvuteqlixkopemnfakyoguxqpwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390940.3278995-551-94177047368471/AnsiballZ_group.py'
Oct 02 07:42:20 compute-0 sudo[33869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:20 compute-0 python3.9[33871]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 02 07:42:20 compute-0 groupadd[33872]: group added to /etc/group: name=hugetlbfs, GID=42477
Oct 02 07:42:20 compute-0 groupadd[33872]: group added to /etc/gshadow: name=hugetlbfs
Oct 02 07:42:20 compute-0 groupadd[33872]: new group: name=hugetlbfs, GID=42477
Oct 02 07:42:20 compute-0 sudo[33869]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:21 compute-0 sudo[34027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brirogaumngffdocxwijrfnmeyhplzjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390941.2317233-569-84382523370941/AnsiballZ_file.py'
Oct 02 07:42:21 compute-0 sudo[34027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:21 compute-0 python3.9[34029]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct 02 07:42:21 compute-0 sudo[34027]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:22 compute-0 sudo[34179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqqkokyrsdvahjhhbhxzzzanffwgeyeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390942.2277813-591-235706786388558/AnsiballZ_dnf.py'
Oct 02 07:42:22 compute-0 sudo[34179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:22 compute-0 python3.9[34181]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:42:23 compute-0 sshd-session[34183]: banner exchange: Connection from 64.62.197.47 port 40428: invalid format
Oct 02 07:42:24 compute-0 sudo[34179]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:24 compute-0 sudo[34333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhhroupfislxcigqijyzoomplpcloryx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390944.5918355-607-173338797451867/AnsiballZ_file.py'
Oct 02 07:42:24 compute-0 sudo[34333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:25 compute-0 python3.9[34335]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:42:25 compute-0 sudo[34333]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:25 compute-0 sudo[34485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdifeavowyovvevzgclanrihfrxvzbuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390945.3704298-623-87875055829688/AnsiballZ_stat.py'
Oct 02 07:42:25 compute-0 sudo[34485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:26 compute-0 python3.9[34487]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:42:26 compute-0 sudo[34485]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:26 compute-0 sudo[34608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzikcfilcywikwbexrzllfxhtzzrtshw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390945.3704298-623-87875055829688/AnsiballZ_copy.py'
Oct 02 07:42:26 compute-0 sudo[34608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:26 compute-0 python3.9[34610]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759390945.3704298-623-87875055829688/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:42:26 compute-0 sudo[34608]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:27 compute-0 sudo[34760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfxtwopjabetbashdeoyttadorrzvqeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390946.8565795-653-216362615765827/AnsiballZ_systemd.py'
Oct 02 07:42:27 compute-0 sudo[34760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:27 compute-0 python3.9[34762]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:42:27 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 02 07:42:28 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct 02 07:42:28 compute-0 kernel: Bridge firewalling registered
Oct 02 07:42:28 compute-0 systemd-modules-load[34766]: Inserted module 'br_netfilter'
Oct 02 07:42:28 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 02 07:42:28 compute-0 sudo[34760]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:28 compute-0 sudo[34919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhbgdjtuwzvvbkuisrotjfhxqcbnlvdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390948.3063023-669-240919493345363/AnsiballZ_stat.py'
Oct 02 07:42:28 compute-0 sudo[34919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:28 compute-0 python3.9[34921]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:42:28 compute-0 sudo[34919]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:29 compute-0 sudo[35042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovgflkcgstslrxrpocwecpdsrfxngwtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390948.3063023-669-240919493345363/AnsiballZ_copy.py'
Oct 02 07:42:29 compute-0 sudo[35042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:29 compute-0 python3.9[35044]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759390948.3063023-669-240919493345363/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:42:29 compute-0 sudo[35042]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:30 compute-0 sudo[35194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ausrvecpjkmjhqjfolqcdvveoiprhcwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390950.0865076-705-231533817798042/AnsiballZ_dnf.py'
Oct 02 07:42:30 compute-0 sudo[35194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:30 compute-0 python3.9[35196]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:42:34 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Oct 02 07:42:34 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Oct 02 07:42:34 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 02 07:42:34 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 02 07:42:34 compute-0 systemd[1]: Reloading.
Oct 02 07:42:35 compute-0 systemd-rc-local-generator[35253]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:42:35 compute-0 systemd[1]: Starting dnf makecache...
Oct 02 07:42:35 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 02 07:42:35 compute-0 dnf[35270]: Failed determining last makecache time.
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-openstack-barbican-42b4c41831408a8e323 108 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 145 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-openstack-cinder-1c00d6490d88e436f26ef 153 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-python-stevedore-c4acc5639fd2329372142 151 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-python-cloudkitty-tests-tempest-3961dc 139 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-os-net-config-28598c2978b9e2207dd19fc4 159 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 157 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-python-designate-tests-tempest-347fdbc 153 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-openstack-glance-1fd12c29b339f30fe823e 143 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 143 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-openstack-manila-3c01b7181572c95dac462 154 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 sudo[35194]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-python-whitebox-neutron-tests-tempest- 137 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-openstack-octavia-ba397f07a7331190208c 151 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-openstack-watcher-c014f81a8647287f6dcc 147 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-edpm-image-builder-55ba53cf215b14ed95b 152 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 164 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-openstack-swift-dc98a8463506ac520c469a 155 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-python-tempestconf-8515371b7cceebd4282 154 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: delorean-openstack-heat-ui-013accbfd179753bc3f0 160 kB/s | 3.0 kB     00:00
Oct 02 07:42:35 compute-0 dnf[35270]: CentOS Stream 9 - BaseOS                         72 kB/s | 6.7 kB     00:00
Oct 02 07:42:36 compute-0 dnf[35270]: CentOS Stream 9 - AppStream                      27 kB/s | 6.8 kB     00:00
Oct 02 07:42:36 compute-0 dnf[35270]: CentOS Stream 9 - CRB                            76 kB/s | 6.6 kB     00:00
Oct 02 07:42:36 compute-0 dnf[35270]: CentOS Stream 9 - Extras packages                85 kB/s | 8.0 kB     00:00
Oct 02 07:42:36 compute-0 dnf[35270]: dlrn-antelope-testing                           170 kB/s | 3.0 kB     00:00
Oct 02 07:42:36 compute-0 dnf[35270]: dlrn-antelope-build-deps                        163 kB/s | 3.0 kB     00:00
Oct 02 07:42:36 compute-0 dnf[35270]: centos9-rabbitmq                                 55 kB/s | 3.0 kB     00:00
Oct 02 07:42:36 compute-0 python3.9[36593]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:42:36 compute-0 dnf[35270]: centos9-storage                                 124 kB/s | 3.0 kB     00:00
Oct 02 07:42:36 compute-0 dnf[35270]: centos9-opstools                                132 kB/s | 3.0 kB     00:00
Oct 02 07:42:36 compute-0 dnf[35270]: NFV SIG OpenvSwitch                              57 kB/s | 3.0 kB     00:00
Oct 02 07:42:36 compute-0 dnf[35270]: repo-setup-centos-appstream                      85 kB/s | 4.4 kB     00:00
Oct 02 07:42:36 compute-0 dnf[35270]: repo-setup-centos-baseos                        170 kB/s | 3.9 kB     00:00
Oct 02 07:42:37 compute-0 dnf[35270]: repo-setup-centos-highavailability              163 kB/s | 3.9 kB     00:00
Oct 02 07:42:37 compute-0 dnf[35270]: repo-setup-centos-powertools                    211 kB/s | 4.3 kB     00:00
Oct 02 07:42:37 compute-0 dnf[35270]: Extra Packages for Enterprise Linux 9 - x86_64  192 kB/s |  28 kB     00:00
Oct 02 07:42:37 compute-0 python3.9[37597]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct 02 07:42:37 compute-0 dnf[35270]: Metadata cache created.
Oct 02 07:42:37 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 02 07:42:37 compute-0 systemd[1]: Finished dnf makecache.
Oct 02 07:42:37 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.700s CPU time.
Oct 02 07:42:38 compute-0 python3.9[38330]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:42:38 compute-0 sudo[39127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvbnvpoblympxfephtmupcrlcdyjkiae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390958.5226378-783-145283814421705/AnsiballZ_command.py'
Oct 02 07:42:38 compute-0 sudo[39127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:39 compute-0 python3.9[39147]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:42:39 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 02 07:42:39 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 02 07:42:39 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 02 07:42:39 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.513s CPU time.
Oct 02 07:42:39 compute-0 systemd[1]: run-rc42889dfe8054ff28a537991a17c9e30.service: Deactivated successfully.
Oct 02 07:42:39 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 02 07:42:39 compute-0 sudo[39127]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:40 compute-0 sudo[39775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umwgwhxeymvnukrllrrlfjpmfnodbrmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390960.0583863-801-143867939359724/AnsiballZ_systemd.py'
Oct 02 07:42:40 compute-0 sudo[39775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:40 compute-0 python3.9[39777]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:42:40 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 02 07:42:40 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Oct 02 07:42:40 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 02 07:42:40 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 02 07:42:40 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 02 07:42:41 compute-0 sudo[39775]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:41 compute-0 python3.9[39939]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct 02 07:42:44 compute-0 sudo[40089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxwfjfcawndqvmajzgitrvadgcmrqmqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390964.3216896-915-23145800943236/AnsiballZ_systemd.py'
Oct 02 07:42:44 compute-0 sudo[40089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:44 compute-0 python3.9[40091]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:42:45 compute-0 systemd[1]: Reloading.
Oct 02 07:42:45 compute-0 systemd-rc-local-generator[40122]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:42:45 compute-0 sudo[40089]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:45 compute-0 sudo[40278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqtfglubofbzoxeuymdvtpkckuwneyim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390965.5623815-915-204923810920242/AnsiballZ_systemd.py'
Oct 02 07:42:45 compute-0 sudo[40278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:46 compute-0 python3.9[40280]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:42:46 compute-0 systemd[1]: Reloading.
Oct 02 07:42:46 compute-0 systemd-rc-local-generator[40310]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:42:46 compute-0 sudo[40278]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:47 compute-0 sudo[40467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpettbflhjgsvfushibyfxdfdagwayxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390966.875041-947-70403666806115/AnsiballZ_command.py'
Oct 02 07:42:47 compute-0 sudo[40467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:47 compute-0 python3.9[40469]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:42:47 compute-0 sudo[40467]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:48 compute-0 sudo[40620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utfuknjpkhcdytcuhsuzaifnaywpgdot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390967.6998494-963-67294928454906/AnsiballZ_command.py'
Oct 02 07:42:48 compute-0 sudo[40620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:48 compute-0 python3.9[40622]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:42:48 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct 02 07:42:48 compute-0 sudo[40620]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:48 compute-0 sudo[40773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdqqlbyefinomyywfjkvvsleywmcbaek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390968.5697913-979-184977992911808/AnsiballZ_command.py'
Oct 02 07:42:48 compute-0 sudo[40773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:49 compute-0 python3.9[40775]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:42:50 compute-0 sudo[40773]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:51 compute-0 sudo[40935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzjouitjokvihkftdgwndkjxunsbrint ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390970.8369827-995-177504219765493/AnsiballZ_command.py'
Oct 02 07:42:51 compute-0 sudo[40935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:51 compute-0 python3.9[40937]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:42:51 compute-0 sudo[40935]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:52 compute-0 sudo[41088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqhpgaymudgrfygxidijoszfacnultei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390971.690214-1011-13213064569079/AnsiballZ_systemd.py'
Oct 02 07:42:52 compute-0 sudo[41088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:42:52 compute-0 python3.9[41090]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:42:52 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 02 07:42:52 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Oct 02 07:42:52 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Oct 02 07:42:52 compute-0 systemd[1]: Starting Apply Kernel Variables...
Oct 02 07:42:52 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 02 07:42:52 compute-0 systemd[1]: Finished Apply Kernel Variables.
Oct 02 07:42:52 compute-0 sudo[41088]: pam_unix(sudo:session): session closed for user root
Oct 02 07:42:52 compute-0 sshd-session[28097]: Connection closed by 192.168.122.30 port 53666
Oct 02 07:42:52 compute-0 sshd-session[28094]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:42:52 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Oct 02 07:42:52 compute-0 systemd[1]: session-10.scope: Consumed 2min 17.307s CPU time.
Oct 02 07:42:52 compute-0 systemd-logind[827]: Session 10 logged out. Waiting for processes to exit.
Oct 02 07:42:52 compute-0 systemd-logind[827]: Removed session 10.
Oct 02 07:42:58 compute-0 sshd-session[41121]: Accepted publickey for zuul from 192.168.122.30 port 52772 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:42:58 compute-0 systemd-logind[827]: New session 11 of user zuul.
Oct 02 07:42:58 compute-0 systemd[1]: Started Session 11 of User zuul.
Oct 02 07:42:58 compute-0 sshd-session[41121]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:42:59 compute-0 python3.9[41274]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:43:00 compute-0 python3.9[41428]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:43:01 compute-0 sudo[41582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqmwlbkpucsbnjvshxciomwkfcikzzrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390981.3989909-80-151300856105357/AnsiballZ_command.py'
Oct 02 07:43:01 compute-0 sudo[41582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:02 compute-0 python3.9[41584]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:43:02 compute-0 sudo[41582]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:03 compute-0 python3.9[41735]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:43:04 compute-0 sudo[41889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwxybjlsjxdmnppxjjfktpcizxqqblaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390983.7603526-120-8545559908582/AnsiballZ_setup.py'
Oct 02 07:43:04 compute-0 sudo[41889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:04 compute-0 python3.9[41891]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:43:04 compute-0 sudo[41889]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:05 compute-0 sudo[41973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khaodwfmqklpzzghuzenxdgepxehyknv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390983.7603526-120-8545559908582/AnsiballZ_dnf.py'
Oct 02 07:43:05 compute-0 sudo[41973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:05 compute-0 python3.9[41975]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:43:06 compute-0 sudo[41973]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:07 compute-0 sudo[42126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enstcenfrerquktccrsexlgdrcjfbdns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390986.7304943-144-55394375338074/AnsiballZ_setup.py'
Oct 02 07:43:07 compute-0 sudo[42126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:07 compute-0 python3.9[42128]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:43:07 compute-0 sudo[42126]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:08 compute-0 sudo[42297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmopuyndatlpowyvvqlgoeedjpudqlog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390987.77512-166-162683612900118/AnsiballZ_file.py'
Oct 02 07:43:08 compute-0 sudo[42297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:08 compute-0 python3.9[42299]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:43:08 compute-0 sudo[42297]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:09 compute-0 sudo[42449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqysgknzeehwmbknyxfjhyocepuazwlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390988.7437985-182-266005249819832/AnsiballZ_command.py'
Oct 02 07:43:09 compute-0 sudo[42449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:09 compute-0 python3.9[42451]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:43:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2849803436-merged.mount: Deactivated successfully.
Oct 02 07:43:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1987357631-merged.mount: Deactivated successfully.
Oct 02 07:43:09 compute-0 podman[42452]: 2025-10-02 07:43:09.304453922 +0000 UTC m=+0.080753398 system refresh
Oct 02 07:43:09 compute-0 sudo[42449]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:10 compute-0 sudo[42612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wauargljdikpeyabgvhbuxbmyilnwzry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390989.558369-198-179566644035817/AnsiballZ_stat.py'
Oct 02 07:43:10 compute-0 sudo[42612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:43:10 compute-0 python3.9[42614]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:43:10 compute-0 sudo[42612]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:10 compute-0 sudo[42735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvbhgumnjglddxgybnnslzdsealqegib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390989.558369-198-179566644035817/AnsiballZ_copy.py'
Oct 02 07:43:10 compute-0 sudo[42735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:11 compute-0 python3.9[42737]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759390989.558369-198-179566644035817/.source.json follow=False _original_basename=podman_network_config.j2 checksum=632dbd107fc2f9a0998e57f85187de85dcfd896c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:43:11 compute-0 sudo[42735]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:11 compute-0 sudo[42887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpmzfynkwqcouhfjvcxlynztrzcogouu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390991.326316-228-230747071912862/AnsiballZ_stat.py'
Oct 02 07:43:11 compute-0 sudo[42887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:11 compute-0 python3.9[42889]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:43:11 compute-0 sudo[42887]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:12 compute-0 sudo[43010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-logdqvopwhtqhjjqlouslzkeukwdzofj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390991.326316-228-230747071912862/AnsiballZ_copy.py'
Oct 02 07:43:12 compute-0 sudo[43010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:12 compute-0 python3.9[43012]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759390991.326316-228-230747071912862/.source.conf follow=False _original_basename=registries.conf.j2 checksum=c7e24e791b23b6ca9af1b87173047a0fb53188da backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:43:12 compute-0 sudo[43010]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:13 compute-0 sudo[43162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcgckbdzkvzbvxlwxynuooesfozmgrtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390992.7514558-260-4303834078743/AnsiballZ_ini_file.py'
Oct 02 07:43:13 compute-0 sudo[43162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:13 compute-0 python3.9[43164]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:43:13 compute-0 sudo[43162]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:14 compute-0 sudo[43314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onvkguvrjdhnznuerlmfbvipbwnklmzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390993.7181363-260-218135404553487/AnsiballZ_ini_file.py'
Oct 02 07:43:14 compute-0 sudo[43314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:14 compute-0 python3.9[43316]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:43:14 compute-0 sudo[43314]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:14 compute-0 sudo[43466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyssnezfzfniuqqjooeujxzuntwosjxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390994.4345994-260-91316259936619/AnsiballZ_ini_file.py'
Oct 02 07:43:14 compute-0 sudo[43466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:15 compute-0 python3.9[43468]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:43:15 compute-0 sudo[43466]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:15 compute-0 sudo[43618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnjhrrufdyksjbcffkeqowdwhfvdezki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390995.2419724-260-878108859387/AnsiballZ_ini_file.py'
Oct 02 07:43:15 compute-0 sudo[43618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:15 compute-0 python3.9[43620]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:43:15 compute-0 sudo[43618]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:16 compute-0 python3.9[43770]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:43:17 compute-0 sudo[43922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuekqxhwyseyeozytrnqnmnwndrxfjpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390997.1097891-340-29121978108137/AnsiballZ_dnf.py'
Oct 02 07:43:17 compute-0 sudo[43922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:17 compute-0 python3.9[43924]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 02 07:43:18 compute-0 sudo[43922]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:19 compute-0 sudo[44075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufastwhvzvkyzolozxuqohohdpnsncav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759390998.9686446-356-71135540742347/AnsiballZ_dnf.py'
Oct 02 07:43:19 compute-0 sudo[44075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:19 compute-0 python3.9[44077]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 02 07:43:21 compute-0 sudo[44075]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:21 compute-0 sudo[44235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlurnwegihgzlcvrvcxafdmafbpvwyug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391001.4818027-376-113700784277968/AnsiballZ_dnf.py'
Oct 02 07:43:21 compute-0 sudo[44235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:22 compute-0 python3.9[44237]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 02 07:43:23 compute-0 sudo[44235]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:23 compute-0 sudo[44388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bibojymbgnzchhlqbqipoetpjsoxrbye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391003.5276659-394-40606073918753/AnsiballZ_dnf.py'
Oct 02 07:43:23 compute-0 sudo[44388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:24 compute-0 python3.9[44390]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 02 07:43:25 compute-0 sudo[44388]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:26 compute-0 sudo[44541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krckmfncdaqljkglougybqyxiewdyllf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391005.6859539-416-155777528464078/AnsiballZ_dnf.py'
Oct 02 07:43:26 compute-0 sudo[44541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:26 compute-0 python3.9[44543]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 02 07:43:27 compute-0 sudo[44541]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:28 compute-0 sudo[44697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxajgzpgfbfihfdvsldvlbcokefbywsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391008.2115362-432-220364641687998/AnsiballZ_dnf.py'
Oct 02 07:43:28 compute-0 sudo[44697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:28 compute-0 python3.9[44699]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 02 07:43:31 compute-0 sudo[44697]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:32 compute-0 sudo[44865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abruzczfiaxuxihafwgzcnwqweiwstqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391012.22287-450-170511242519915/AnsiballZ_dnf.py'
Oct 02 07:43:32 compute-0 sudo[44865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:32 compute-0 python3.9[44867]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 02 07:43:33 compute-0 sudo[44865]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:34 compute-0 sudo[45018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzmqwgdufgzbgdsebaoccjxorwkenhky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391014.2790082-468-164586181156255/AnsiballZ_dnf.py'
Oct 02 07:43:34 compute-0 sudo[45018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:34 compute-0 python3.9[45020]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 02 07:43:49 compute-0 sudo[45018]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:50 compute-0 sudo[45355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reptxbcbzfrkmhtjqehetqhwgnowlzor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391030.3717954-490-32133484229981/AnsiballZ_file.py'
Oct 02 07:43:50 compute-0 sudo[45355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:50 compute-0 python3.9[45357]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:43:50 compute-0 sudo[45355]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:51 compute-0 sudo[45530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnqwznrpolrurjwmnrqhbvkqsareqwhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391031.19352-506-51856119198124/AnsiballZ_stat.py'
Oct 02 07:43:51 compute-0 sudo[45530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:51 compute-0 python3.9[45532]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:43:51 compute-0 sudo[45530]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:52 compute-0 sudo[45653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iihywhuzqpyxqjwdmeyhyujgwciuiddj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391031.19352-506-51856119198124/AnsiballZ_copy.py'
Oct 02 07:43:52 compute-0 sudo[45653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:52 compute-0 python3.9[45655]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759391031.19352-506-51856119198124/.source.json _original_basename=.fxzrrbm8 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:43:52 compute-0 sudo[45653]: pam_unix(sudo:session): session closed for user root
Oct 02 07:43:53 compute-0 sudo[45805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sduabmavnqxukljfqofnyxqnfxokodzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391032.9722092-542-199944703655596/AnsiballZ_podman_image.py'
Oct 02 07:43:53 compute-0 sudo[45805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:43:53 compute-0 python3.9[45807]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 02 07:43:53 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:43:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat328347274-lower\x2dmapped.mount: Deactivated successfully.
Oct 02 07:44:00 compute-0 podman[45819]: 2025-10-02 07:44:00.410995472 +0000 UTC m=+6.628339644 image pull 81d94872551c3ae3c30801602bbb5f0c44872f15dcde472a0ba869fe2f28966e quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct 02 07:44:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:00 compute-0 sudo[45805]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:01 compute-0 sudo[46114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndyiaeytcgmruucavpoiymubidmkuuvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391040.997945-560-46281678969722/AnsiballZ_podman_image.py'
Oct 02 07:44:01 compute-0 sudo[46114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:01 compute-0 python3.9[46116]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 02 07:44:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:04 compute-0 podman[46128]: 2025-10-02 07:44:04.154716867 +0000 UTC m=+2.558442279 image pull ceb6fcca0131acbc0ff37d5322c126e14f8045fca848e7440fedac2d6444d8c2 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct 02 07:44:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:04 compute-0 sudo[46114]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:05 compute-0 sudo[46380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxemqlyiggttuhladfflbsppcufmqiey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391044.8289914-582-254974506854043/AnsiballZ_podman_image.py'
Oct 02 07:44:05 compute-0 sudo[46380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:05 compute-0 python3.9[46382]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 02 07:44:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:15 compute-0 podman[46394]: 2025-10-02 07:44:15.280655352 +0000 UTC m=+9.711927598 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 07:44:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:15 compute-0 sudo[46380]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:16 compute-0 sudo[46704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckstdgtmgimqsvxcrttykxomvbitnwhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391055.947295-602-253805786512052/AnsiballZ_podman_image.py'
Oct 02 07:44:16 compute-0 sudo[46704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:16 compute-0 python3.9[46706]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 02 07:44:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:17 compute-0 podman[46718]: 2025-10-02 07:44:17.54615704 +0000 UTC m=+1.047888849 image pull 4ee39d2b05f9d7d8e7f025baefe799c33619f4419f4eb27d17ca383a40343475 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct 02 07:44:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:17 compute-0 sudo[46704]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:18 compute-0 sudo[46945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqmpglpmjygppqzvkdaofarogycfzyto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391058.1775966-620-184419686377585/AnsiballZ_podman_image.py'
Oct 02 07:44:18 compute-0 sudo[46945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:18 compute-0 python3.9[46947]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 02 07:44:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:21 compute-0 sshd-session[46983]: Received disconnect from 193.46.255.7 port 16230:11:  [preauth]
Oct 02 07:44:21 compute-0 sshd-session[46983]: Disconnected from authenticating user root 193.46.255.7 port 16230 [preauth]
Oct 02 07:44:30 compute-0 podman[46960]: 2025-10-02 07:44:30.284381728 +0000 UTC m=+11.524858582 image pull cb7a9bebda1404fc92f1415580e7da04b5fcfd160582e38b9b99703a41ed1519 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct 02 07:44:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:30 compute-0 sudo[46945]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:31 compute-0 sudo[47222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blusiyumcmxiwbsnpddmkpsegcrcfuyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391070.869058-642-234017840690561/AnsiballZ_podman_image.py'
Oct 02 07:44:31 compute-0 sudo[47222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:31 compute-0 python3.9[47224]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 02 07:44:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:36 compute-0 podman[47236]: 2025-10-02 07:44:36.82162843 +0000 UTC m=+5.344382856 image pull 79bd76c8b82e725db5528fc603bf7cb7171545aa782e3febfc2255ff980f1ffb quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct 02 07:44:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:37 compute-0 sudo[47222]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:37 compute-0 sudo[47491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhejaiehlleqemebksyscujxoyvcfsso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391077.2703333-642-267564808217868/AnsiballZ_podman_image.py'
Oct 02 07:44:37 compute-0 sudo[47491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:37 compute-0 python3.9[47493]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 02 07:44:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:39 compute-0 podman[47507]: 2025-10-02 07:44:39.267829354 +0000 UTC m=+1.357123988 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct 02 07:44:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:44:39 compute-0 sudo[47491]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:41 compute-0 sshd-session[41124]: Connection closed by 192.168.122.30 port 52772
Oct 02 07:44:41 compute-0 sshd-session[41121]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:44:41 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Oct 02 07:44:41 compute-0 systemd[1]: session-11.scope: Consumed 1min 53.545s CPU time.
Oct 02 07:44:41 compute-0 systemd-logind[827]: Session 11 logged out. Waiting for processes to exit.
Oct 02 07:44:41 compute-0 systemd-logind[827]: Removed session 11.
Oct 02 07:44:46 compute-0 sshd-session[47655]: Accepted publickey for zuul from 192.168.122.30 port 45008 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:44:46 compute-0 systemd-logind[827]: New session 12 of user zuul.
Oct 02 07:44:46 compute-0 systemd[1]: Started Session 12 of User zuul.
Oct 02 07:44:46 compute-0 sshd-session[47655]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:44:47 compute-0 python3.9[47808]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:44:48 compute-0 sudo[47962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cngejynooalvxqspcbftrnrobfbwxnpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391088.2967618-52-62321366539743/AnsiballZ_getent.py'
Oct 02 07:44:48 compute-0 sudo[47962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:49 compute-0 python3.9[47964]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct 02 07:44:49 compute-0 sudo[47962]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:50 compute-0 sudo[48115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geovuhcbcrnvuehmpwqgfmitsscksrcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391090.249508-68-156603107018568/AnsiballZ_group.py'
Oct 02 07:44:50 compute-0 sudo[48115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:50 compute-0 python3.9[48117]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 02 07:44:50 compute-0 groupadd[48118]: group added to /etc/group: name=openvswitch, GID=42476
Oct 02 07:44:50 compute-0 groupadd[48118]: group added to /etc/gshadow: name=openvswitch
Oct 02 07:44:50 compute-0 groupadd[48118]: new group: name=openvswitch, GID=42476
Oct 02 07:44:51 compute-0 sudo[48115]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:51 compute-0 sudo[48273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdhguntefwzxwmhbxznhhtovhushvewq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391091.2547686-84-193468568462491/AnsiballZ_user.py'
Oct 02 07:44:51 compute-0 sudo[48273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:52 compute-0 python3.9[48275]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 02 07:44:52 compute-0 useradd[48277]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Oct 02 07:44:52 compute-0 useradd[48277]: add 'openvswitch' to group 'hugetlbfs'
Oct 02 07:44:52 compute-0 useradd[48277]: add 'openvswitch' to shadow group 'hugetlbfs'
Oct 02 07:44:52 compute-0 sudo[48273]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:52 compute-0 sudo[48433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drfcbieabgvbztrkpmmtxqylcwpplplt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391092.4248483-104-246043014321313/AnsiballZ_setup.py'
Oct 02 07:44:52 compute-0 sudo[48433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:53 compute-0 python3.9[48435]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:44:53 compute-0 sudo[48433]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:53 compute-0 sudo[48517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvhbbjgjvnsxvyvypcmkhpwszonemanu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391092.4248483-104-246043014321313/AnsiballZ_dnf.py'
Oct 02 07:44:53 compute-0 sudo[48517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:54 compute-0 python3.9[48519]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 02 07:44:55 compute-0 sudo[48517]: pam_unix(sudo:session): session closed for user root
Oct 02 07:44:56 compute-0 sudo[48678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhvvziwnveglulkjrzqsqcfmoxrstdmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391095.8390899-132-218782905287569/AnsiballZ_dnf.py'
Oct 02 07:44:56 compute-0 sudo[48678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:44:56 compute-0 python3.9[48680]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:45:07 compute-0 kernel: SELinux:  Converting 2725 SID table entries...
Oct 02 07:45:07 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:45:07 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 02 07:45:07 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:45:07 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:45:07 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:45:07 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:45:07 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:45:07 compute-0 groupadd[48703]: group added to /etc/group: name=unbound, GID=993
Oct 02 07:45:07 compute-0 groupadd[48703]: group added to /etc/gshadow: name=unbound
Oct 02 07:45:07 compute-0 groupadd[48703]: new group: name=unbound, GID=993
Oct 02 07:45:08 compute-0 useradd[48710]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Oct 02 07:45:08 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct 02 07:45:08 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct 02 07:45:09 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 02 07:45:09 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 02 07:45:09 compute-0 systemd[1]: Reloading.
Oct 02 07:45:09 compute-0 systemd-rc-local-generator[49208]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:45:09 compute-0 systemd-sysv-generator[49211]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:45:10 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 02 07:45:10 compute-0 sudo[48678]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:10 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 02 07:45:10 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 02 07:45:10 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.001s CPU time.
Oct 02 07:45:10 compute-0 systemd[1]: run-rd429be8eb4c044af98297a9fde117f42.service: Deactivated successfully.
Oct 02 07:45:11 compute-0 sudo[49779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwswpmpasvbiagptoequhkfbpcfcgail ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391111.1968734-148-34602591698832/AnsiballZ_systemd.py'
Oct 02 07:45:11 compute-0 sudo[49779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:12 compute-0 python3.9[49781]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 02 07:45:13 compute-0 systemd[1]: Reloading.
Oct 02 07:45:13 compute-0 systemd-rc-local-generator[49812]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:45:13 compute-0 systemd-sysv-generator[49816]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:45:13 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Oct 02 07:45:13 compute-0 chown[49824]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct 02 07:45:13 compute-0 ovs-ctl[49829]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct 02 07:45:13 compute-0 ovs-ctl[49829]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct 02 07:45:13 compute-0 ovs-ctl[49829]: Starting ovsdb-server [  OK  ]
Oct 02 07:45:13 compute-0 ovs-vsctl[49878]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct 02 07:45:13 compute-0 ovs-vsctl[49894]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"66c4bca3-98aa-4361-8801-8722dd9a7888\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct 02 07:45:13 compute-0 ovs-ctl[49829]: Configuring Open vSwitch system IDs [  OK  ]
Oct 02 07:45:13 compute-0 ovs-vsctl[49903]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct 02 07:45:13 compute-0 ovs-ctl[49829]: Enabling remote OVSDB managers [  OK  ]
Oct 02 07:45:13 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Oct 02 07:45:13 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct 02 07:45:13 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct 02 07:45:13 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct 02 07:45:14 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Oct 02 07:45:14 compute-0 ovs-ctl[49947]: Inserting openvswitch module [  OK  ]
Oct 02 07:45:14 compute-0 ovs-ctl[49916]: Starting ovs-vswitchd [  OK  ]
Oct 02 07:45:14 compute-0 ovs-vsctl[49964]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct 02 07:45:14 compute-0 ovs-ctl[49916]: Enabling remote OVSDB managers [  OK  ]
Oct 02 07:45:14 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct 02 07:45:14 compute-0 systemd[1]: Starting Open vSwitch...
Oct 02 07:45:14 compute-0 systemd[1]: Finished Open vSwitch.
Oct 02 07:45:14 compute-0 sudo[49779]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:15 compute-0 python3.9[50116]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:45:16 compute-0 sudo[50266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxgbiuqlkjgypsqgqcowoiouqfgvmeet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391115.673447-184-204800807251888/AnsiballZ_sefcontext.py'
Oct 02 07:45:16 compute-0 sudo[50266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:16 compute-0 python3.9[50268]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct 02 07:45:17 compute-0 kernel: SELinux:  Converting 2739 SID table entries...
Oct 02 07:45:17 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:45:17 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 02 07:45:17 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:45:17 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:45:17 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:45:17 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:45:17 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:45:18 compute-0 sudo[50266]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:18 compute-0 python3.9[50423]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:45:19 compute-0 sudo[50579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atibpoamcboefmejdnylxlvmqzkruyhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391119.3301232-220-255798214937942/AnsiballZ_dnf.py'
Oct 02 07:45:19 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct 02 07:45:19 compute-0 sudo[50579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:19 compute-0 python3.9[50581]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:45:21 compute-0 sudo[50579]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:21 compute-0 sudo[50732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzmkzjirknycacotxwnxjlxylbsmfnkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391121.3571634-236-61270955811997/AnsiballZ_command.py'
Oct 02 07:45:21 compute-0 sudo[50732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:21 compute-0 python3.9[50734]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:45:22 compute-0 sudo[50732]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:23 compute-0 sudo[51019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjispslwapyzetahhitmfbiuryueptpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391122.9845245-252-42003677612316/AnsiballZ_file.py'
Oct 02 07:45:23 compute-0 sudo[51019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:23 compute-0 python3.9[51021]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 02 07:45:23 compute-0 sudo[51019]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:24 compute-0 python3.9[51171]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:45:25 compute-0 sudo[51323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eztbgeeaftrjpsavvnbxjvufpjlyriez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391124.744524-284-128330414843812/AnsiballZ_dnf.py'
Oct 02 07:45:25 compute-0 sudo[51323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:25 compute-0 python3.9[51325]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:45:27 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 02 07:45:27 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 02 07:45:27 compute-0 systemd[1]: Reloading.
Oct 02 07:45:27 compute-0 systemd-sysv-generator[51365]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:45:27 compute-0 systemd-rc-local-generator[51362]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:45:27 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 02 07:45:27 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 02 07:45:27 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 02 07:45:27 compute-0 systemd[1]: run-r1497cb5b470a4c72800b9c59e19d6bea.service: Deactivated successfully.
Oct 02 07:45:27 compute-0 sudo[51323]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:28 compute-0 sudo[51640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpowmyiqldbhmudianwxafhltlkqrydp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391128.1417513-300-199592155188468/AnsiballZ_systemd.py'
Oct 02 07:45:28 compute-0 sudo[51640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:28 compute-0 python3.9[51642]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:45:28 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 02 07:45:28 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Oct 02 07:45:28 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Oct 02 07:45:28 compute-0 systemd[1]: Stopping Network Manager...
Oct 02 07:45:28 compute-0 NetworkManager[3947]: <info>  [1759391128.8793] caught SIGTERM, shutting down normally.
Oct 02 07:45:28 compute-0 NetworkManager[3947]: <info>  [1759391128.8809] dhcp4 (eth0): canceled DHCP transaction
Oct 02 07:45:28 compute-0 NetworkManager[3947]: <info>  [1759391128.8809] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 02 07:45:28 compute-0 NetworkManager[3947]: <info>  [1759391128.8809] dhcp4 (eth0): state changed no lease
Oct 02 07:45:28 compute-0 NetworkManager[3947]: <info>  [1759391128.8812] manager: NetworkManager state is now CONNECTED_SITE
Oct 02 07:45:28 compute-0 NetworkManager[3947]: <info>  [1759391128.8881] exiting (success)
Oct 02 07:45:28 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 02 07:45:28 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 02 07:45:28 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 02 07:45:28 compute-0 systemd[1]: Stopped Network Manager.
Oct 02 07:45:28 compute-0 systemd[1]: NetworkManager.service: Consumed 11.762s CPU time, 4.1M memory peak, read 0B from disk, written 34.5K to disk.
Oct 02 07:45:28 compute-0 systemd[1]: Starting Network Manager...
Oct 02 07:45:28 compute-0 NetworkManager[51654]: <info>  [1759391128.9583] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:f022b9c3-1b0c-4cc3-9fcc-f643153c9b0a)
Oct 02 07:45:28 compute-0 NetworkManager[51654]: <info>  [1759391128.9585] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 02 07:45:28 compute-0 NetworkManager[51654]: <info>  [1759391128.9671] manager[0x564dcdfa7090]: monitoring kernel firmware directory '/lib/firmware'.
Oct 02 07:45:28 compute-0 systemd[1]: Starting Hostname Service...
Oct 02 07:45:29 compute-0 systemd[1]: Started Hostname Service.
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0491] hostname: hostname: using hostnamed
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0492] hostname: static hostname changed from (none) to "compute-0"
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0496] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0500] manager[0x564dcdfa7090]: rfkill: Wi-Fi hardware radio set enabled
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0500] manager[0x564dcdfa7090]: rfkill: WWAN hardware radio set enabled
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0519] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0528] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0528] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0529] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0529] manager: Networking is enabled by state file
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0531] settings: Loaded settings plugin: keyfile (internal)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0534] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0559] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0567] dhcp: init: Using DHCP client 'internal'
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0569] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0573] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0578] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0584] device (lo): Activation: starting connection 'lo' (77cf1f15-4a84-4c7c-ae0f-bf80f6a18c78)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0589] device (eth0): carrier: link connected
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0593] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0597] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0597] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0602] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0608] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0614] device (eth1): carrier: link connected
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0618] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0623] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (cfce9517-0ffb-5f79-8907-2e072b5156ab) (indicated)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0624] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0629] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0635] device (eth1): Activation: starting connection 'ci-private-network' (cfce9517-0ffb-5f79-8907-2e072b5156ab)
Oct 02 07:45:29 compute-0 systemd[1]: Started Network Manager.
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0643] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0651] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0653] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0654] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0656] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0658] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0660] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0663] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0666] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0683] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0686] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0703] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0715] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0721] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0724] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0728] device (lo): Activation: successful, device activated.
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0761] dhcp4 (eth0): state changed new lease, address=38.102.83.73
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0767] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 02 07:45:29 compute-0 systemd[1]: Starting Network Manager Wait Online...
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0825] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0830] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0837] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0840] manager: NetworkManager state is now CONNECTED_LOCAL
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0844] device (eth1): Activation: successful, device activated.
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0858] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0860] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0863] manager: NetworkManager state is now CONNECTED_SITE
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0867] device (eth0): Activation: successful, device activated.
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0874] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 02 07:45:29 compute-0 NetworkManager[51654]: <info>  [1759391129.0901] manager: startup complete
Oct 02 07:45:29 compute-0 sudo[51640]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:29 compute-0 systemd[1]: Finished Network Manager Wait Online.
Oct 02 07:45:29 compute-0 sudo[51866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqgkjppihenkvtcfprxegoidbptakbbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391129.325114-316-208832273424862/AnsiballZ_dnf.py'
Oct 02 07:45:29 compute-0 sudo[51866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:29 compute-0 python3.9[51868]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:45:34 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 02 07:45:34 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 02 07:45:34 compute-0 systemd[1]: Reloading.
Oct 02 07:45:34 compute-0 systemd-sysv-generator[51924]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:45:34 compute-0 systemd-rc-local-generator[51921]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:45:34 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 02 07:45:35 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 02 07:45:35 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 02 07:45:35 compute-0 systemd[1]: run-r318fee773d114d4e9fc20cc947d7f012.service: Deactivated successfully.
Oct 02 07:45:35 compute-0 sudo[51866]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:36 compute-0 sudo[52328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmnzmjaitybhwncilthqrtmcejoodrtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391136.3702435-340-60116746547497/AnsiballZ_stat.py'
Oct 02 07:45:36 compute-0 sudo[52328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:36 compute-0 python3.9[52330]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:45:36 compute-0 sudo[52328]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:37 compute-0 sudo[52480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gygxxqpfbigyuljupbqprewcvzblyeiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391137.1647294-358-263183430512620/AnsiballZ_ini_file.py'
Oct 02 07:45:37 compute-0 sudo[52480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:37 compute-0 python3.9[52482]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:45:37 compute-0 sudo[52480]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:38 compute-0 sudo[52634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xemmddjftxwhvtbgzwhoonmvcnjqxuqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391138.1475537-378-47855374438552/AnsiballZ_ini_file.py'
Oct 02 07:45:38 compute-0 sudo[52634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:38 compute-0 python3.9[52636]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:45:38 compute-0 sudo[52634]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:39 compute-0 sudo[52786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuwdrqipdouzdgwwpviruciiinqbuuef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391138.8305116-378-63242661807093/AnsiballZ_ini_file.py'
Oct 02 07:45:39 compute-0 sudo[52786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:39 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 02 07:45:39 compute-0 python3.9[52788]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:45:39 compute-0 sudo[52786]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:39 compute-0 sudo[52938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muvvwdbzcqbkuzpipvjmrzzzlxkxbian ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391139.5776684-408-72057151075464/AnsiballZ_ini_file.py'
Oct 02 07:45:39 compute-0 sudo[52938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:40 compute-0 python3.9[52940]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:45:40 compute-0 sudo[52938]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:40 compute-0 sudo[53090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dulqixazwgxpfgmpqnkokjpkmgmhwvro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391140.294227-408-210545496229055/AnsiballZ_ini_file.py'
Oct 02 07:45:40 compute-0 sudo[53090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:40 compute-0 python3.9[53092]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:45:40 compute-0 sudo[53090]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:41 compute-0 sudo[53242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjzwsxivfruugniyyjzutqbwozuiwxde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391141.0535393-438-99418583883211/AnsiballZ_stat.py'
Oct 02 07:45:41 compute-0 sudo[53242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:41 compute-0 python3.9[53244]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:45:41 compute-0 sudo[53242]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:42 compute-0 sudo[53365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuwrlnbkesejslkybkzwomynnvztqtyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391141.0535393-438-99418583883211/AnsiballZ_copy.py'
Oct 02 07:45:42 compute-0 sudo[53365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:42 compute-0 python3.9[53367]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391141.0535393-438-99418583883211/.source _original_basename=.xxjtz0g1 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:45:42 compute-0 sudo[53365]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:43 compute-0 sudo[53517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlyouzbhsfnitmxakqdatfjgocxlukip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391142.7141027-468-15911674507359/AnsiballZ_file.py'
Oct 02 07:45:43 compute-0 sudo[53517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:43 compute-0 python3.9[53519]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:45:43 compute-0 sudo[53517]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:44 compute-0 sudo[53669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuehjlvobiyvwlfaccsxapoksfbcfwhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391143.4947195-484-175561746232702/AnsiballZ_edpm_os_net_config_mappings.py'
Oct 02 07:45:44 compute-0 sudo[53669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:44 compute-0 python3.9[53671]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct 02 07:45:44 compute-0 sudo[53669]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:44 compute-0 sudo[53821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-salgaxglvtjbpoasatgptnspeekioghh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391144.5699146-502-170202176749439/AnsiballZ_file.py'
Oct 02 07:45:44 compute-0 sudo[53821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:45 compute-0 python3.9[53823]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:45:45 compute-0 sudo[53821]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:45 compute-0 sudo[53973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chyklksgxxlvplbubeeurbecvuitkncf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391145.5746214-522-134268021258707/AnsiballZ_stat.py'
Oct 02 07:45:45 compute-0 sudo[53973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:46 compute-0 sudo[53973]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:46 compute-0 sudo[54096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mphzlbhvqgzchxtjsvqkteeqnjnjiffi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391145.5746214-522-134268021258707/AnsiballZ_copy.py'
Oct 02 07:45:46 compute-0 sudo[54096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:46 compute-0 sudo[54096]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:47 compute-0 sudo[54248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrrfeasoqcawhfncpoozjcfdvpvbrwmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391146.9848511-552-223393593835934/AnsiballZ_slurp.py'
Oct 02 07:45:47 compute-0 sudo[54248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:47 compute-0 python3.9[54250]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct 02 07:45:47 compute-0 sudo[54248]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:48 compute-0 sudo[54423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lflrndsbvpoxslqakpdxabymbzssrvgf ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391147.979802-570-220834116131700/async_wrapper.py j536850666458 300 /home/zuul/.ansible/tmp/ansible-tmp-1759391147.979802-570-220834116131700/AnsiballZ_edpm_os_net_config.py _'
Oct 02 07:45:48 compute-0 sudo[54423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:48 compute-0 ansible-async_wrapper.py[54425]: Invoked with j536850666458 300 /home/zuul/.ansible/tmp/ansible-tmp-1759391147.979802-570-220834116131700/AnsiballZ_edpm_os_net_config.py _
Oct 02 07:45:49 compute-0 ansible-async_wrapper.py[54428]: Starting module and watcher
Oct 02 07:45:49 compute-0 ansible-async_wrapper.py[54428]: Start watching 54429 (300)
Oct 02 07:45:49 compute-0 ansible-async_wrapper.py[54429]: Start module (54429)
Oct 02 07:45:49 compute-0 ansible-async_wrapper.py[54425]: Return async_wrapper task started.
Oct 02 07:45:49 compute-0 sudo[54423]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:49 compute-0 python3.9[54430]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct 02 07:45:49 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct 02 07:45:49 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct 02 07:45:49 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct 02 07:45:49 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct 02 07:45:49 compute-0 kernel: cfg80211: failed to load regulatory.db
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.1130] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.1152] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.1931] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.1935] audit: op="connection-add" uuid="1f3915d5-f87f-4951-91e2-bc8e10d72aa6" name="br-ex-br" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.1967] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.1970] audit: op="connection-add" uuid="41e658fa-cb92-4b61-a6d5-95ee045e50e3" name="br-ex-port" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.1997] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2000] audit: op="connection-add" uuid="8903b57a-1c8d-43d6-9c46-bca858d0adcd" name="eth1-port" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2027] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2030] audit: op="connection-add" uuid="d37eff35-2bd3-4352-9a4b-5fd8c575ecc2" name="vlan20-port" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2057] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2060] audit: op="connection-add" uuid="03a30238-6362-4e7f-95d6-d48ceb4f5dfd" name="vlan21-port" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2083] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2085] audit: op="connection-add" uuid="86b03beb-370a-4b77-87c4-0b5d5ab66b75" name="vlan22-port" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2127] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2157] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2162] audit: op="connection-add" uuid="989bb158-53af-487a-9d21-493d498554bc" name="br-ex-if" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2232] audit: op="connection-update" uuid="cfce9517-0ffb-5f79-8907-2e072b5156ab" name="ci-private-network" args="connection.port-type,connection.controller,connection.slave-type,connection.timestamp,connection.master,ipv6.method,ipv6.addr-gen-mode,ipv6.dns,ipv6.routes,ipv6.addresses,ipv6.routing-rules,ovs-external-ids.data,ipv4.method,ipv4.never-default,ipv4.dns,ipv4.routes,ipv4.addresses,ipv4.routing-rules,ovs-interface.type" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2269] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2272] audit: op="connection-add" uuid="8e2950bd-91d8-4f4e-a65a-b72e11e30d4f" name="vlan20-if" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2308] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2310] audit: op="connection-add" uuid="a629105c-10cc-4184-9396-d009a6281d2e" name="vlan21-if" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2345] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2347] audit: op="connection-add" uuid="65ecd521-d6e5-4d4f-9d4b-6dc7a6e2ca7b" name="vlan22-if" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2370] audit: op="connection-delete" uuid="40b6ce4d-1768-3626-b020-83475d2a4193" name="Wired connection 1" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2392] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2408] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2417] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (1f3915d5-f87f-4951-91e2-bc8e10d72aa6)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2418] audit: op="connection-activate" uuid="1f3915d5-f87f-4951-91e2-bc8e10d72aa6" name="br-ex-br" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2422] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2436] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2442] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (41e658fa-cb92-4b61-a6d5-95ee045e50e3)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2447] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2458] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2468] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (8903b57a-1c8d-43d6-9c46-bca858d0adcd)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2471] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2485] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2492] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (d37eff35-2bd3-4352-9a4b-5fd8c575ecc2)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2494] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2507] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2515] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (03a30238-6362-4e7f-95d6-d48ceb4f5dfd)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2518] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2530] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2537] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (86b03beb-370a-4b77-87c4-0b5d5ab66b75)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2538] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2542] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2545] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2556] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2564] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2571] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (989bb158-53af-487a-9d21-493d498554bc)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2572] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2575] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2579] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2580] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2581] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2598] device (eth1): disconnecting for new activation request.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2598] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2601] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2603] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2604] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2607] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2614] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2619] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (8e2950bd-91d8-4f4e-a65a-b72e11e30d4f)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2620] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2623] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2625] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2626] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2629] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2634] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2638] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (a629105c-10cc-4184-9396-d009a6281d2e)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2639] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2642] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2644] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2645] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2648] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2654] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2658] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (65ecd521-d6e5-4d4f-9d4b-6dc7a6e2ca7b)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2659] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2662] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2663] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2665] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2666] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2685] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2688] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2692] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2695] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2706] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2713] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2718] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2723] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2725] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 kernel: ovs-system: entered promiscuous mode
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2742] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2746] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2749] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2751] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2759] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2766] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 kernel: Timeout policy base is empty
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2770] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2773] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2780] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 systemd-udevd[54436]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2786] dhcp4 (eth0): canceled DHCP transaction
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2786] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2787] dhcp4 (eth0): state changed no lease
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2788] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2801] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2805] audit: op="device-reapply" interface="eth1" ifindex=3 pid=54431 uid=0 result="fail" reason="Device is not activated"
Oct 02 07:45:51 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2854] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2858] dhcp4 (eth0): state changed new lease, address=38.102.83.73
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2909] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2918] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2924] device (eth1): disconnecting for new activation request.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2925] audit: op="connection-activate" uuid="cfce9517-0ffb-5f79-8907-2e072b5156ab" name="ci-private-network" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.2969] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3080] device (eth1): Activation: starting connection 'ci-private-network' (cfce9517-0ffb-5f79-8907-2e072b5156ab)
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3101] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3104] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3109] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54431 uid=0 result="success"
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3110] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3111] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3203] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3205] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3206] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3207] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3210] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3218] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3223] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3229] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3233] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3237] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3242] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3245] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3248] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct 02 07:45:51 compute-0 kernel: br-ex: entered promiscuous mode
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3253] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3255] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3258] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3261] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3266] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3269] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3322] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3323] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3328] device (eth1): Activation: successful, device activated.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3381] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3390] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 kernel: vlan22: entered promiscuous mode
Oct 02 07:45:51 compute-0 systemd-udevd[54437]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 07:45:51 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3454] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3458] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3465] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 02 07:45:51 compute-0 kernel: vlan20: entered promiscuous mode
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3519] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3530] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 kernel: vlan21: entered promiscuous mode
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3566] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3568] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3583] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3598] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3609] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3648] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3650] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3657] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3675] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3683] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3717] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3718] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 02 07:45:51 compute-0 NetworkManager[51654]: <info>  [1759391151.3723] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 02 07:45:52 compute-0 NetworkManager[51654]: <info>  [1759391152.4831] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54431 uid=0 result="success"
Oct 02 07:45:52 compute-0 NetworkManager[51654]: <info>  [1759391152.6651] checkpoint[0x564dcdf7c950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct 02 07:45:52 compute-0 NetworkManager[51654]: <info>  [1759391152.6655] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54431 uid=0 result="success"
Oct 02 07:45:52 compute-0 sudo[54762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysqzexovupezpvcicommhzwtkledsjwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391152.1837862-570-95192040200424/AnsiballZ_async_status.py'
Oct 02 07:45:52 compute-0 sudo[54762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:53 compute-0 python3.9[54764]: ansible-ansible.legacy.async_status Invoked with jid=j536850666458.54425 mode=status _async_dir=/root/.ansible_async
Oct 02 07:45:53 compute-0 NetworkManager[51654]: <info>  [1759391153.0204] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54431 uid=0 result="success"
Oct 02 07:45:53 compute-0 NetworkManager[51654]: <info>  [1759391153.0245] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54431 uid=0 result="success"
Oct 02 07:45:53 compute-0 sudo[54762]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:53 compute-0 NetworkManager[51654]: <info>  [1759391153.2557] audit: op="networking-control" arg="global-dns-configuration" pid=54431 uid=0 result="success"
Oct 02 07:45:53 compute-0 NetworkManager[51654]: <info>  [1759391153.2584] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct 02 07:45:53 compute-0 NetworkManager[51654]: <info>  [1759391153.2617] audit: op="networking-control" arg="global-dns-configuration" pid=54431 uid=0 result="success"
Oct 02 07:45:53 compute-0 NetworkManager[51654]: <info>  [1759391153.2643] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54431 uid=0 result="success"
Oct 02 07:45:53 compute-0 NetworkManager[51654]: <info>  [1759391153.4254] checkpoint[0x564dcdf7ca20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct 02 07:45:53 compute-0 NetworkManager[51654]: <info>  [1759391153.4258] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54431 uid=0 result="success"
Oct 02 07:45:53 compute-0 ansible-async_wrapper.py[54429]: Module complete (54429)
Oct 02 07:45:54 compute-0 ansible-async_wrapper.py[54428]: Done in kid B.
Oct 02 07:45:56 compute-0 sudo[54868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmwulvtsyzjwscyphdwfoyazwdqaqcpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391152.1837862-570-95192040200424/AnsiballZ_async_status.py'
Oct 02 07:45:56 compute-0 sudo[54868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:56 compute-0 python3.9[54870]: ansible-ansible.legacy.async_status Invoked with jid=j536850666458.54425 mode=status _async_dir=/root/.ansible_async
Oct 02 07:45:56 compute-0 sudo[54868]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:56 compute-0 sudo[54967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myggghvhldcnjkewqxfizwqfvbuxdgek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391152.1837862-570-95192040200424/AnsiballZ_async_status.py'
Oct 02 07:45:56 compute-0 sudo[54967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:57 compute-0 python3.9[54969]: ansible-ansible.legacy.async_status Invoked with jid=j536850666458.54425 mode=cleanup _async_dir=/root/.ansible_async
Oct 02 07:45:57 compute-0 sudo[54967]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:57 compute-0 sudo[55119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quvofnuygdrjiropibtbgtnpfqsyipkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391157.4961948-624-194145974888545/AnsiballZ_stat.py'
Oct 02 07:45:57 compute-0 sudo[55119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:58 compute-0 python3.9[55121]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:45:58 compute-0 sudo[55119]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:58 compute-0 sudo[55242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkbzcmnilwxsinchzvywofzaorumxipi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391157.4961948-624-194145974888545/AnsiballZ_copy.py'
Oct 02 07:45:58 compute-0 sudo[55242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:58 compute-0 python3.9[55244]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391157.4961948-624-194145974888545/.source.returncode _original_basename=.3d3v5ktr follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:45:58 compute-0 sudo[55242]: pam_unix(sudo:session): session closed for user root
Oct 02 07:45:59 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 02 07:45:59 compute-0 sudo[55396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkqjrreigwjmokodliweanaynhrzzvap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391159.0649185-656-247737543998334/AnsiballZ_stat.py'
Oct 02 07:45:59 compute-0 sudo[55396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:45:59 compute-0 python3.9[55398]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:45:59 compute-0 sudo[55396]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:00 compute-0 sudo[55520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tebniwgwthvysqfbboepfrciprrjmfyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391159.0649185-656-247737543998334/AnsiballZ_copy.py'
Oct 02 07:46:00 compute-0 sudo[55520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:00 compute-0 python3.9[55522]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391159.0649185-656-247737543998334/.source.cfg _original_basename=.sttei77l follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:46:00 compute-0 sudo[55520]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:00 compute-0 sudo[55672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usdtgiyuoymockcrjxjmfjdxecuwbfal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391160.4886544-686-214285716573527/AnsiballZ_systemd.py'
Oct 02 07:46:00 compute-0 sudo[55672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:01 compute-0 python3.9[55674]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:46:01 compute-0 systemd[1]: Reloading Network Manager...
Oct 02 07:46:01 compute-0 NetworkManager[51654]: <info>  [1759391161.2641] audit: op="reload" arg="0" pid=55678 uid=0 result="success"
Oct 02 07:46:01 compute-0 NetworkManager[51654]: <info>  [1759391161.2652] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct 02 07:46:01 compute-0 systemd[1]: Reloaded Network Manager.
Oct 02 07:46:01 compute-0 sudo[55672]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:01 compute-0 sshd-session[47658]: Connection closed by 192.168.122.30 port 45008
Oct 02 07:46:01 compute-0 sshd-session[47655]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:46:01 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Oct 02 07:46:01 compute-0 systemd[1]: session-12.scope: Consumed 53.236s CPU time.
Oct 02 07:46:01 compute-0 systemd-logind[827]: Session 12 logged out. Waiting for processes to exit.
Oct 02 07:46:01 compute-0 systemd-logind[827]: Removed session 12.
Oct 02 07:46:07 compute-0 sshd-session[55709]: Accepted publickey for zuul from 192.168.122.30 port 60964 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:46:07 compute-0 systemd-logind[827]: New session 13 of user zuul.
Oct 02 07:46:07 compute-0 systemd[1]: Started Session 13 of User zuul.
Oct 02 07:46:07 compute-0 sshd-session[55709]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:46:08 compute-0 python3.9[55862]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:46:09 compute-0 python3.9[56017]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:46:11 compute-0 python3.9[56206]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:46:11 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 02 07:46:11 compute-0 sshd-session[55712]: Connection closed by 192.168.122.30 port 60964
Oct 02 07:46:11 compute-0 sshd-session[55709]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:46:11 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Oct 02 07:46:11 compute-0 systemd[1]: session-13.scope: Consumed 2.703s CPU time.
Oct 02 07:46:11 compute-0 systemd-logind[827]: Session 13 logged out. Waiting for processes to exit.
Oct 02 07:46:11 compute-0 systemd-logind[827]: Removed session 13.
Oct 02 07:46:17 compute-0 sshd-session[56235]: Accepted publickey for zuul from 192.168.122.30 port 38180 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:46:17 compute-0 systemd-logind[827]: New session 14 of user zuul.
Oct 02 07:46:17 compute-0 systemd[1]: Started Session 14 of User zuul.
Oct 02 07:46:17 compute-0 sshd-session[56235]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:46:18 compute-0 python3.9[56388]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:46:19 compute-0 python3.9[56543]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:46:20 compute-0 sudo[56697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdkdwykwjmtaouczskuoexsfjijtktpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391180.0185204-60-27351434422314/AnsiballZ_setup.py'
Oct 02 07:46:20 compute-0 sudo[56697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:20 compute-0 python3.9[56699]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:46:20 compute-0 sudo[56697]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:21 compute-0 sudo[56781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afbjsdavegsjxqiqegjqkjdzciixmotx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391180.0185204-60-27351434422314/AnsiballZ_dnf.py'
Oct 02 07:46:21 compute-0 sudo[56781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:21 compute-0 python3.9[56783]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:46:22 compute-0 sudo[56781]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:23 compute-0 sudo[56935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wospjucjdrhyrthjcracwwsnpkiozxig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391182.9796863-84-8876817370654/AnsiballZ_setup.py'
Oct 02 07:46:23 compute-0 sudo[56935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:23 compute-0 python3.9[56937]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:46:23 compute-0 sudo[56935]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:24 compute-0 sudo[57126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdervfatlxicmojurorkivdtmholasxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391184.1935704-106-279366216980262/AnsiballZ_file.py'
Oct 02 07:46:24 compute-0 sudo[57126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:24 compute-0 python3.9[57128]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:46:24 compute-0 sudo[57126]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:25 compute-0 sudo[57278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwvfjfnfregdkgnpczxasfxuktcweyew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391184.9625926-122-159340065668305/AnsiballZ_command.py'
Oct 02 07:46:25 compute-0 sudo[57278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:25 compute-0 python3.9[57280]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:46:25 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:46:25 compute-0 sudo[57278]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:26 compute-0 sudo[57441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzflrvcfbdlmcphnitfqobbcxhdonbgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391186.0024297-138-207346260267003/AnsiballZ_stat.py'
Oct 02 07:46:26 compute-0 sudo[57441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:26 compute-0 python3.9[57443]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:46:26 compute-0 sudo[57441]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:26 compute-0 sudo[57519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjutffhhkiekyjgkspepfqptelhdrybe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391186.0024297-138-207346260267003/AnsiballZ_file.py'
Oct 02 07:46:26 compute-0 sudo[57519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:27 compute-0 python3.9[57521]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:46:27 compute-0 sudo[57519]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:27 compute-0 sudo[57671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exfttvgboneqrfuggpdtovitmcfafjkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391187.4596093-162-160326035919762/AnsiballZ_stat.py'
Oct 02 07:46:27 compute-0 sudo[57671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:28 compute-0 python3.9[57673]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:46:28 compute-0 sudo[57671]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:28 compute-0 sudo[57749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dflcldcnpeziplnyeeqszqpzkkzlelvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391187.4596093-162-160326035919762/AnsiballZ_file.py'
Oct 02 07:46:28 compute-0 sudo[57749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:28 compute-0 python3.9[57751]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:46:28 compute-0 sudo[57749]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:29 compute-0 sudo[57901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwjdkdqogqsuiqjipaksjceziecirytn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391188.754004-188-182656640821905/AnsiballZ_ini_file.py'
Oct 02 07:46:29 compute-0 sudo[57901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:29 compute-0 python3.9[57903]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:46:29 compute-0 sudo[57901]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:29 compute-0 sudo[58053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqwpbwiorkztbvtfaxhpzqxiqaopvcij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391189.6297464-188-268385439208603/AnsiballZ_ini_file.py'
Oct 02 07:46:29 compute-0 sudo[58053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:30 compute-0 python3.9[58055]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:46:30 compute-0 sudo[58053]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:30 compute-0 sudo[58205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otvrudxjbmchkswergvrvsedgajsqtlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391190.2690654-188-179580484981771/AnsiballZ_ini_file.py'
Oct 02 07:46:30 compute-0 sudo[58205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:30 compute-0 python3.9[58207]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:46:30 compute-0 sudo[58205]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:31 compute-0 sudo[58357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmiuxzcpukhbhlxvibxwnctxsnlhzjul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391191.0533392-188-261849927723320/AnsiballZ_ini_file.py'
Oct 02 07:46:31 compute-0 sudo[58357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:31 compute-0 python3.9[58359]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:46:31 compute-0 sudo[58357]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:32 compute-0 sudo[58509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfoqadplzmbtokyappirthyaeyningto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391191.9422922-250-19775066971625/AnsiballZ_dnf.py'
Oct 02 07:46:32 compute-0 sudo[58509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:32 compute-0 python3.9[58511]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:46:33 compute-0 sudo[58509]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:34 compute-0 sudo[58662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvpkrqfkvpeaxbfrdirmcsnjvatjtinh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391194.2228267-272-49493909220427/AnsiballZ_setup.py'
Oct 02 07:46:34 compute-0 sudo[58662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:34 compute-0 python3.9[58664]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:46:34 compute-0 sudo[58662]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:35 compute-0 sudo[58816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqbqphbmwfgelfkwmaanzwxuqhyiyvma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391195.136732-288-114573541117428/AnsiballZ_stat.py'
Oct 02 07:46:35 compute-0 sudo[58816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:35 compute-0 python3.9[58818]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:46:35 compute-0 sudo[58816]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:36 compute-0 sudo[58968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhnzqcuxxgnutlwxzaqtvoopoiiytprf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391196.1492057-306-268813505847345/AnsiballZ_stat.py'
Oct 02 07:46:36 compute-0 sudo[58968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:36 compute-0 python3.9[58970]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:46:36 compute-0 sudo[58968]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:37 compute-0 sudo[59120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkuxihekdrnhwwxvlnhknsbhysyeklfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391197.0376594-326-215844890732785/AnsiballZ_service_facts.py'
Oct 02 07:46:37 compute-0 sudo[59120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:37 compute-0 python3.9[59122]: ansible-service_facts Invoked
Oct 02 07:46:37 compute-0 network[59139]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 02 07:46:37 compute-0 network[59140]: 'network-scripts' will be removed from distribution in near future.
Oct 02 07:46:37 compute-0 network[59141]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 02 07:46:41 compute-0 sudo[59120]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:43 compute-0 sudo[59426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywnqeovqnjuvsisljqasgthfpqevtcmo ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1759391202.908044-352-53693425547779/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1759391202.908044-352-53693425547779/args'
Oct 02 07:46:43 compute-0 sudo[59426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:43 compute-0 sudo[59426]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:44 compute-0 sudo[59593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qosexuizhjplgazzgeionxadotgwjeom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391203.728765-374-221806214534096/AnsiballZ_dnf.py'
Oct 02 07:46:44 compute-0 sudo[59593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:44 compute-0 python3.9[59595]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:46:45 compute-0 sudo[59593]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:46 compute-0 sudo[59746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aipltvhxxxzgbzchxwhuooqfdtjusbmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391205.8443522-400-254767214972135/AnsiballZ_package_facts.py'
Oct 02 07:46:46 compute-0 sudo[59746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:46 compute-0 python3.9[59748]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct 02 07:46:47 compute-0 sudo[59746]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:48 compute-0 sudo[59898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zygkfipmvdaqgirwarkolzhyumacmpru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391207.7829137-420-178747966612006/AnsiballZ_stat.py'
Oct 02 07:46:48 compute-0 sudo[59898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:48 compute-0 python3.9[59900]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:46:48 compute-0 sudo[59898]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:49 compute-0 sudo[60023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amkhmyxrpexasyvhhspvnkdxedfpsmbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391207.7829137-420-178747966612006/AnsiballZ_copy.py'
Oct 02 07:46:49 compute-0 sudo[60023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:49 compute-0 python3.9[60025]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391207.7829137-420-178747966612006/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:46:49 compute-0 sudo[60023]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:49 compute-0 sudo[60177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nobbfywzbpoydgehbcbmpqbwtqlzlrxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391209.519046-450-127835821122336/AnsiballZ_stat.py'
Oct 02 07:46:49 compute-0 sudo[60177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:50 compute-0 python3.9[60179]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:46:50 compute-0 sudo[60177]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:50 compute-0 sudo[60302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjytgjazigodzkufconygajybvnkejfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391209.519046-450-127835821122336/AnsiballZ_copy.py'
Oct 02 07:46:50 compute-0 sudo[60302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:50 compute-0 python3.9[60304]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391209.519046-450-127835821122336/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:46:50 compute-0 sudo[60302]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:51 compute-0 sudo[60456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axcfoivovbcshjranfpgmdjbyfbbszpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391211.46791-492-46993055982749/AnsiballZ_lineinfile.py'
Oct 02 07:46:51 compute-0 sudo[60456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:52 compute-0 python3.9[60458]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:46:52 compute-0 sudo[60456]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:53 compute-0 sudo[60610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fewcgarmnacqzslklczvsuxcfgzhflin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391212.9856374-522-128926851509439/AnsiballZ_setup.py'
Oct 02 07:46:53 compute-0 sudo[60610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:53 compute-0 python3.9[60612]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:46:53 compute-0 sudo[60610]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:54 compute-0 sudo[60694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-someandkuhrzdiziqrwnplooiujpkpvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391212.9856374-522-128926851509439/AnsiballZ_systemd.py'
Oct 02 07:46:54 compute-0 sudo[60694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:54 compute-0 python3.9[60696]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:46:54 compute-0 sudo[60694]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:55 compute-0 sudo[60848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcdsdbpsjrwhystthcthorqlaenehlrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391215.538399-554-267722245773190/AnsiballZ_setup.py'
Oct 02 07:46:55 compute-0 sudo[60848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:56 compute-0 python3.9[60850]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:46:56 compute-0 sudo[60848]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:56 compute-0 sudo[60932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceozdkhlcqhsbifrrborinpaavzzgfnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391215.538399-554-267722245773190/AnsiballZ_systemd.py'
Oct 02 07:46:56 compute-0 sudo[60932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:46:57 compute-0 python3.9[60934]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:46:57 compute-0 chronyd[835]: chronyd exiting
Oct 02 07:46:57 compute-0 systemd[1]: Stopping NTP client/server...
Oct 02 07:46:57 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Oct 02 07:46:57 compute-0 systemd[1]: Stopped NTP client/server.
Oct 02 07:46:57 compute-0 systemd[1]: Starting NTP client/server...
Oct 02 07:46:57 compute-0 chronyd[60942]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 02 07:46:57 compute-0 chronyd[60942]: Frequency 9.178 +/- 0.190 ppm read from /var/lib/chrony/drift
Oct 02 07:46:57 compute-0 chronyd[60942]: Loaded seccomp filter (level 2)
Oct 02 07:46:57 compute-0 systemd[1]: Started NTP client/server.
Oct 02 07:46:57 compute-0 sudo[60932]: pam_unix(sudo:session): session closed for user root
Oct 02 07:46:57 compute-0 sshd-session[56238]: Connection closed by 192.168.122.30 port 38180
Oct 02 07:46:57 compute-0 sshd-session[56235]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:46:57 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Oct 02 07:46:57 compute-0 systemd[1]: session-14.scope: Consumed 27.950s CPU time.
Oct 02 07:46:57 compute-0 systemd-logind[827]: Session 14 logged out. Waiting for processes to exit.
Oct 02 07:46:57 compute-0 systemd-logind[827]: Removed session 14.
Oct 02 07:47:03 compute-0 sshd-session[60968]: Accepted publickey for zuul from 192.168.122.30 port 55664 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:47:03 compute-0 systemd-logind[827]: New session 15 of user zuul.
Oct 02 07:47:03 compute-0 systemd[1]: Started Session 15 of User zuul.
Oct 02 07:47:03 compute-0 sshd-session[60968]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:47:04 compute-0 python3.9[61121]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:47:05 compute-0 sudo[61275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqmmgagkzgxvepkpxnnfuoqkimayabet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391224.9898794-46-24424571430384/AnsiballZ_file.py'
Oct 02 07:47:05 compute-0 sudo[61275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:05 compute-0 python3.9[61277]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:47:05 compute-0 sudo[61275]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:06 compute-0 sudo[61450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdyztlciezrqhdmrtaqhvjxptplelfjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391225.9934351-62-91678976609166/AnsiballZ_stat.py'
Oct 02 07:47:06 compute-0 sudo[61450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:06 compute-0 python3.9[61452]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:47:06 compute-0 sudo[61450]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:07 compute-0 sudo[61528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmezwqdjavuizopffwumwzwktfijsmgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391225.9934351-62-91678976609166/AnsiballZ_file.py'
Oct 02 07:47:07 compute-0 sudo[61528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:07 compute-0 python3.9[61530]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.9c8rwqkl recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:47:07 compute-0 sudo[61528]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:08 compute-0 sudo[61680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cliaxgjpbleylqhiwrnzvvuuukqmjgbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391227.7728863-102-155131010993663/AnsiballZ_stat.py'
Oct 02 07:47:08 compute-0 sudo[61680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:08 compute-0 python3.9[61682]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:47:08 compute-0 sudo[61680]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:08 compute-0 sudo[61803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upurtuarsipaltbfkyurjlodmglybfii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391227.7728863-102-155131010993663/AnsiballZ_copy.py'
Oct 02 07:47:08 compute-0 sudo[61803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:09 compute-0 python3.9[61805]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391227.7728863-102-155131010993663/.source _original_basename=.86v2f_w4 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:47:09 compute-0 sudo[61803]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:09 compute-0 sudo[61955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlowqqdpmuzptktwmydqugqcmhedvvgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391229.4021003-134-6129379813649/AnsiballZ_file.py'
Oct 02 07:47:09 compute-0 sudo[61955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:09 compute-0 python3.9[61957]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:47:09 compute-0 sudo[61955]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:10 compute-0 sudo[62107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omvretqvyrlhbgyrnxredeonvbboczjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391230.1783743-150-26082927326349/AnsiballZ_stat.py'
Oct 02 07:47:10 compute-0 sudo[62107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:10 compute-0 python3.9[62109]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:47:10 compute-0 sudo[62107]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:11 compute-0 sudo[62230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvwoaxbgjheauivqqliqgpfodbjjghwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391230.1783743-150-26082927326349/AnsiballZ_copy.py'
Oct 02 07:47:11 compute-0 sudo[62230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:11 compute-0 python3.9[62232]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391230.1783743-150-26082927326349/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:47:11 compute-0 sudo[62230]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:11 compute-0 sudo[62382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sovzkujjrciowbpaugxhgqoswwkvqemg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391231.3973749-150-186027781366664/AnsiballZ_stat.py'
Oct 02 07:47:11 compute-0 sudo[62382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:11 compute-0 python3.9[62384]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:47:11 compute-0 sudo[62382]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:12 compute-0 sudo[62505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxxgeampmervegvefbtcfrcbauslnhcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391231.3973749-150-186027781366664/AnsiballZ_copy.py'
Oct 02 07:47:12 compute-0 sudo[62505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:12 compute-0 python3.9[62507]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391231.3973749-150-186027781366664/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:47:12 compute-0 sudo[62505]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:12 compute-0 sudo[62657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umbtkbkaqmtvqvwxuyzmjvwskoycgbxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391232.6464453-208-41565501936050/AnsiballZ_file.py'
Oct 02 07:47:12 compute-0 sudo[62657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:13 compute-0 python3.9[62659]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:47:13 compute-0 sudo[62657]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:13 compute-0 sudo[62809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsimtewhcsdyqwkrolmrniybvgzxyzan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391233.318403-224-202803162090347/AnsiballZ_stat.py'
Oct 02 07:47:13 compute-0 sudo[62809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:13 compute-0 python3.9[62811]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:47:13 compute-0 sudo[62809]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:14 compute-0 sudo[62932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysfwjqermbktmcrjhjltphejsmlcjuxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391233.318403-224-202803162090347/AnsiballZ_copy.py'
Oct 02 07:47:14 compute-0 sudo[62932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:14 compute-0 python3.9[62934]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391233.318403-224-202803162090347/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:47:14 compute-0 sudo[62932]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:15 compute-0 sudo[63084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbwdtafyizmyzevcwylycratdibztoxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391234.7497163-254-180802928189798/AnsiballZ_stat.py'
Oct 02 07:47:15 compute-0 sudo[63084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:15 compute-0 python3.9[63086]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:47:15 compute-0 sudo[63084]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:15 compute-0 sudo[63207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrucunsswdgslvafcrgvugoizttkbcgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391234.7497163-254-180802928189798/AnsiballZ_copy.py'
Oct 02 07:47:15 compute-0 sudo[63207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:15 compute-0 python3.9[63209]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391234.7497163-254-180802928189798/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:47:15 compute-0 sudo[63207]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:16 compute-0 sudo[63359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxpbqrkehnekqovbxyxfvnsicosybbhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391236.1187193-284-67149635577343/AnsiballZ_systemd.py'
Oct 02 07:47:16 compute-0 sudo[63359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:17 compute-0 python3.9[63361]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:47:17 compute-0 systemd[1]: Reloading.
Oct 02 07:47:17 compute-0 systemd-sysv-generator[63390]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:47:17 compute-0 systemd-rc-local-generator[63384]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:47:17 compute-0 systemd[1]: Reloading.
Oct 02 07:47:17 compute-0 systemd-rc-local-generator[63427]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:47:17 compute-0 systemd-sysv-generator[63432]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:47:17 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Oct 02 07:47:17 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Oct 02 07:47:17 compute-0 sudo[63359]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:18 compute-0 sudo[63587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuwqxmssdgzmgnpkoktgldsfzgsjmmgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391237.9637673-300-63015026109274/AnsiballZ_stat.py'
Oct 02 07:47:18 compute-0 sudo[63587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:18 compute-0 python3.9[63589]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:47:18 compute-0 sudo[63587]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:18 compute-0 sudo[63710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkxynkkstwzmzckwjegveavcwaesvyua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391237.9637673-300-63015026109274/AnsiballZ_copy.py'
Oct 02 07:47:18 compute-0 sudo[63710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:19 compute-0 python3.9[63712]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391237.9637673-300-63015026109274/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:47:19 compute-0 sudo[63710]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:19 compute-0 sudo[63862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhpzphzvldqeqegxvzauepgxewhutlcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391239.3422081-330-49729654894389/AnsiballZ_stat.py'
Oct 02 07:47:19 compute-0 sudo[63862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:19 compute-0 python3.9[63864]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:47:19 compute-0 sudo[63862]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:20 compute-0 sudo[63985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auhjuaxtmpdhwlecinvfqjbyvrrdoskk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391239.3422081-330-49729654894389/AnsiballZ_copy.py'
Oct 02 07:47:20 compute-0 sudo[63985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:20 compute-0 python3.9[63987]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391239.3422081-330-49729654894389/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:47:20 compute-0 sudo[63985]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:21 compute-0 sudo[64137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvfhdjsufpbtkfdqbluoxftbygqnyrug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391240.7065458-360-15478060334556/AnsiballZ_systemd.py'
Oct 02 07:47:21 compute-0 sudo[64137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:21 compute-0 python3.9[64139]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:47:21 compute-0 systemd[1]: Reloading.
Oct 02 07:47:21 compute-0 systemd-sysv-generator[64167]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:47:21 compute-0 systemd-rc-local-generator[64163]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:47:21 compute-0 systemd[1]: Reloading.
Oct 02 07:47:21 compute-0 systemd-rc-local-generator[64203]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:47:21 compute-0 systemd-sysv-generator[64208]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:47:21 compute-0 systemd[1]: Starting Create netns directory...
Oct 02 07:47:21 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 02 07:47:21 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 02 07:47:21 compute-0 systemd[1]: Finished Create netns directory.
Oct 02 07:47:22 compute-0 sudo[64137]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:22 compute-0 python3.9[64365]: ansible-ansible.builtin.service_facts Invoked
Oct 02 07:47:23 compute-0 network[64382]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 02 07:47:23 compute-0 network[64383]: 'network-scripts' will be removed from distribution in near future.
Oct 02 07:47:23 compute-0 network[64384]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 02 07:47:27 compute-0 sudo[64646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wegqjvrqvljisbmxvmnifmeppzcxcxpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391246.7109272-392-123610460785289/AnsiballZ_systemd.py'
Oct 02 07:47:27 compute-0 sudo[64646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:27 compute-0 python3.9[64648]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:47:27 compute-0 systemd[1]: Reloading.
Oct 02 07:47:27 compute-0 systemd-sysv-generator[64680]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:47:27 compute-0 systemd-rc-local-generator[64674]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:47:27 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Oct 02 07:47:27 compute-0 iptables.init[64689]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct 02 07:47:28 compute-0 iptables.init[64689]: iptables: Flushing firewall rules: [  OK  ]
Oct 02 07:47:28 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Oct 02 07:47:28 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Oct 02 07:47:28 compute-0 sudo[64646]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:28 compute-0 sudo[64883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqrkofwmfflkmevzhyzjantyjqpxhhdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391248.2470098-392-71396279536723/AnsiballZ_systemd.py'
Oct 02 07:47:28 compute-0 sudo[64883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:28 compute-0 python3.9[64885]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:47:28 compute-0 sudo[64883]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:29 compute-0 sudo[65037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhiwxypnflyjalyeufgumogweiwwvsay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391249.2742753-424-45727255936942/AnsiballZ_systemd.py'
Oct 02 07:47:29 compute-0 sudo[65037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:29 compute-0 python3.9[65039]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:47:29 compute-0 systemd[1]: Reloading.
Oct 02 07:47:30 compute-0 systemd-rc-local-generator[65069]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:47:30 compute-0 systemd-sysv-generator[65074]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:47:30 compute-0 systemd[1]: Starting Netfilter Tables...
Oct 02 07:47:30 compute-0 systemd[1]: Finished Netfilter Tables.
Oct 02 07:47:30 compute-0 sudo[65037]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:30 compute-0 sudo[65229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnkbxzvzzgdpwdofrkfhgffdllohsaib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391250.4948485-440-89579115744948/AnsiballZ_command.py'
Oct 02 07:47:30 compute-0 sudo[65229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:31 compute-0 python3.9[65231]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:47:31 compute-0 sudo[65229]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:31 compute-0 sudo[65382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwpvvdgndutqhnxjkpaernelgajosvmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391251.635046-468-111711529256174/AnsiballZ_stat.py'
Oct 02 07:47:32 compute-0 sudo[65382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:32 compute-0 python3.9[65384]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:47:32 compute-0 sudo[65382]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:32 compute-0 sudo[65507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maovkpegbwrgzrsuijtbfqscebckimrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391251.635046-468-111711529256174/AnsiballZ_copy.py'
Oct 02 07:47:32 compute-0 sudo[65507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:47:32 compute-0 python3.9[65509]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391251.635046-468-111711529256174/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:47:33 compute-0 sudo[65507]: pam_unix(sudo:session): session closed for user root
Oct 02 07:47:33 compute-0 python3.9[65660]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:47:33 compute-0 polkitd[6272]: Registered Authentication Agent for unix-process:65662:214143 (system bus name :1.553 [/usr/bin/pkttyagent --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Oct 02 07:47:58 compute-0 polkit-agent-helper-1[65674]: pam_unix(polkit-1:auth): conversation failed
Oct 02 07:47:58 compute-0 polkit-agent-helper-1[65674]: pam_unix(polkit-1:auth): auth could not identify password for [root]
Oct 02 07:47:58 compute-0 polkitd[6272]: Unregistered Authentication Agent for unix-process:65662:214143 (system bus name :1.553, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Oct 02 07:47:58 compute-0 polkitd[6272]: Operator of unix-process:65662:214143 FAILED to authenticate to gain authorization for action org.freedesktop.systemd1.manage-units for system-bus-name::1.552 [<unknown>] (owned by unix-user:zuul)
Oct 02 07:47:59 compute-0 sshd-session[60971]: Connection closed by 192.168.122.30 port 55664
Oct 02 07:47:59 compute-0 sshd-session[60968]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:47:59 compute-0 systemd-logind[827]: Session 15 logged out. Waiting for processes to exit.
Oct 02 07:47:59 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Oct 02 07:47:59 compute-0 systemd[1]: session-15.scope: Consumed 21.861s CPU time.
Oct 02 07:47:59 compute-0 systemd-logind[827]: Removed session 15.
Oct 02 07:48:11 compute-0 sshd-session[65700]: Accepted publickey for zuul from 192.168.122.30 port 59150 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:48:11 compute-0 systemd-logind[827]: New session 16 of user zuul.
Oct 02 07:48:11 compute-0 systemd[1]: Started Session 16 of User zuul.
Oct 02 07:48:11 compute-0 sshd-session[65700]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:48:12 compute-0 python3.9[65853]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:48:13 compute-0 sudo[66007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxepavtdrdapetqlldcijagfvefxemcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391293.4622147-46-29347768314316/AnsiballZ_file.py'
Oct 02 07:48:13 compute-0 sudo[66007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:14 compute-0 python3.9[66009]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:14 compute-0 sudo[66007]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:15 compute-0 sudo[66182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhapjjflyvvggqigszocodjysqmfprrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391294.4183745-62-237396306525218/AnsiballZ_stat.py'
Oct 02 07:48:15 compute-0 sudo[66182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:15 compute-0 python3.9[66184]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:15 compute-0 sudo[66182]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:15 compute-0 sudo[66260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiviwnyldavftxzkhhoagdbqvupppurx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391294.4183745-62-237396306525218/AnsiballZ_file.py'
Oct 02 07:48:15 compute-0 sudo[66260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:15 compute-0 python3.9[66262]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.55zwiyrv recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:15 compute-0 sudo[66260]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:16 compute-0 sudo[66412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brrjefuyiyoxdcdnvvhhyrauovajhvvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391296.1801386-102-46853982752335/AnsiballZ_stat.py'
Oct 02 07:48:16 compute-0 sudo[66412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:16 compute-0 python3.9[66414]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:16 compute-0 sudo[66412]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:16 compute-0 sudo[66490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyuwhhlxzwldfkbfeohkxnfaplfjpokp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391296.1801386-102-46853982752335/AnsiballZ_file.py'
Oct 02 07:48:16 compute-0 sudo[66490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:17 compute-0 python3.9[66492]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.rr4iab19 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:17 compute-0 sudo[66490]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:17 compute-0 sudo[66642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csubrjddkkxgytmyqmpqtmyustshigkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391297.3206868-128-135284615289040/AnsiballZ_file.py'
Oct 02 07:48:17 compute-0 sudo[66642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:17 compute-0 python3.9[66644]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:48:17 compute-0 sudo[66642]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:18 compute-0 sudo[66794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqmoktpridvqikkrgfervifdwomuqllw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391298.0632496-144-206498688150750/AnsiballZ_stat.py'
Oct 02 07:48:18 compute-0 sudo[66794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:18 compute-0 python3.9[66796]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:18 compute-0 sudo[66794]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:18 compute-0 sudo[66872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-injodjjavjwybnfznlwqrdorujfqohjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391298.0632496-144-206498688150750/AnsiballZ_file.py'
Oct 02 07:48:18 compute-0 sudo[66872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:19 compute-0 python3.9[66874]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:48:19 compute-0 sudo[66872]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:19 compute-0 sudo[67024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhxiccmbbihzwqccdmbeezumzpaeluuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391299.2827513-144-228239183470829/AnsiballZ_stat.py'
Oct 02 07:48:19 compute-0 sudo[67024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:19 compute-0 python3.9[67026]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:19 compute-0 sudo[67024]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:20 compute-0 sudo[67102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpyppaovkinupsnieegtmdvlbdvrdvsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391299.2827513-144-228239183470829/AnsiballZ_file.py'
Oct 02 07:48:20 compute-0 sudo[67102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:20 compute-0 python3.9[67104]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:48:20 compute-0 sudo[67102]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:20 compute-0 sudo[67254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuubceqkcfkuzxxjkrzgmqsmcsmdykkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391300.566119-190-122664795737060/AnsiballZ_file.py'
Oct 02 07:48:20 compute-0 sudo[67254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:21 compute-0 python3.9[67256]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:21 compute-0 sudo[67254]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:21 compute-0 sudo[67406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfphoflxwefnvruixbmgbybvgwccdfcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391301.28887-206-51788246142364/AnsiballZ_stat.py'
Oct 02 07:48:21 compute-0 sudo[67406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:21 compute-0 python3.9[67408]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:21 compute-0 sudo[67406]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:22 compute-0 sudo[67484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oquxsrofytlgkntggbivdmzfwhpvlcku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391301.28887-206-51788246142364/AnsiballZ_file.py'
Oct 02 07:48:22 compute-0 sudo[67484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:22 compute-0 python3.9[67486]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:22 compute-0 sudo[67484]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:23 compute-0 sudo[67636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrrodroxqjcqtcbkflhaqrzcedqkjgqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391302.6963546-230-52432034510843/AnsiballZ_stat.py'
Oct 02 07:48:23 compute-0 sudo[67636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:23 compute-0 python3.9[67638]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:23 compute-0 sudo[67636]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:23 compute-0 sudo[67714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etaicmhqxtlqnlboegzqglnfvvhmvsyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391302.6963546-230-52432034510843/AnsiballZ_file.py'
Oct 02 07:48:23 compute-0 sudo[67714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:23 compute-0 python3.9[67716]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:23 compute-0 sudo[67714]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:24 compute-0 sudo[67866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvxujizafczddbbtnjbetkvdqrpygpea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391303.9552464-254-93049386452597/AnsiballZ_systemd.py'
Oct 02 07:48:24 compute-0 sudo[67866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:25 compute-0 python3.9[67868]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:48:25 compute-0 systemd[1]: Reloading.
Oct 02 07:48:25 compute-0 systemd-rc-local-generator[67896]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:48:25 compute-0 systemd-sysv-generator[67899]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:48:25 compute-0 sudo[67866]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:25 compute-0 sudo[68055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdghcehgbxciwselcyayukihbsvccysv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391305.5096343-270-235265383172887/AnsiballZ_stat.py'
Oct 02 07:48:25 compute-0 sudo[68055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:26 compute-0 python3.9[68057]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:26 compute-0 sudo[68055]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:26 compute-0 sudo[68133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnltbqfkbpgqqdlmivajfevgzaxwaicz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391305.5096343-270-235265383172887/AnsiballZ_file.py'
Oct 02 07:48:26 compute-0 sudo[68133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:26 compute-0 python3.9[68135]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:26 compute-0 sudo[68133]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:27 compute-0 sudo[68285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaivtdrmzptboqzbmvuehfhtnuouaqth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391306.7088923-294-17738317732504/AnsiballZ_stat.py'
Oct 02 07:48:27 compute-0 sudo[68285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:27 compute-0 python3.9[68287]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:27 compute-0 sudo[68285]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:27 compute-0 sudo[68363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdccjxzlkvkislnylsexlrrvnblvienf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391306.7088923-294-17738317732504/AnsiballZ_file.py'
Oct 02 07:48:27 compute-0 sudo[68363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:27 compute-0 python3.9[68365]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:27 compute-0 sudo[68363]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:28 compute-0 sudo[68515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykpojkoeclbmmaxpnfmaqtmlsbfsfcda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391307.9685173-318-183958719594998/AnsiballZ_systemd.py'
Oct 02 07:48:28 compute-0 sudo[68515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:28 compute-0 python3.9[68517]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:48:28 compute-0 systemd[1]: Reloading.
Oct 02 07:48:28 compute-0 systemd-rc-local-generator[68545]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:48:28 compute-0 systemd-sysv-generator[68548]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:48:28 compute-0 systemd[1]: Starting Create netns directory...
Oct 02 07:48:28 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 02 07:48:28 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 02 07:48:28 compute-0 systemd[1]: Finished Create netns directory.
Oct 02 07:48:28 compute-0 sudo[68515]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:29 compute-0 python3.9[68710]: ansible-ansible.builtin.service_facts Invoked
Oct 02 07:48:30 compute-0 network[68727]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 02 07:48:30 compute-0 network[68728]: 'network-scripts' will be removed from distribution in near future.
Oct 02 07:48:30 compute-0 network[68729]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 02 07:48:35 compute-0 sudo[68990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sukaltdiekhgmuybsjlkyhcoltecyglt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391315.1373076-370-146199982255507/AnsiballZ_stat.py'
Oct 02 07:48:35 compute-0 sudo[68990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:35 compute-0 python3.9[68992]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:35 compute-0 sudo[68990]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:35 compute-0 sudo[69068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmklscszsmgrozfbsourrdlbiroedwwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391315.1373076-370-146199982255507/AnsiballZ_file.py'
Oct 02 07:48:35 compute-0 sudo[69068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:36 compute-0 python3.9[69070]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:36 compute-0 sudo[69068]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:36 compute-0 sudo[69220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkmjphataeroevujrvfegurbokuufipb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391316.3515468-396-25964689395574/AnsiballZ_file.py'
Oct 02 07:48:36 compute-0 sudo[69220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:36 compute-0 python3.9[69222]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:36 compute-0 sudo[69220]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:37 compute-0 sudo[69372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emueyyqojvqskurrswriwdjubcahryyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391317.2242107-412-98787297174971/AnsiballZ_stat.py'
Oct 02 07:48:37 compute-0 sudo[69372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:37 compute-0 python3.9[69374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:37 compute-0 sudo[69372]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:38 compute-0 sudo[69495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tayxofxdnedxtjccoylfbyxsopxzvyno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391317.2242107-412-98787297174971/AnsiballZ_copy.py'
Oct 02 07:48:38 compute-0 sudo[69495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:38 compute-0 python3.9[69497]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391317.2242107-412-98787297174971/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:38 compute-0 sudo[69495]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:39 compute-0 sudo[69647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kchamuhnhaesytldmyfubzvmhxrlusri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391319.0527608-448-180849619452495/AnsiballZ_timezone.py'
Oct 02 07:48:39 compute-0 sudo[69647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:39 compute-0 python3.9[69649]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 02 07:48:39 compute-0 systemd[1]: Starting Time & Date Service...
Oct 02 07:48:39 compute-0 systemd[1]: Started Time & Date Service.
Oct 02 07:48:39 compute-0 sudo[69647]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:40 compute-0 sudo[69803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dldyqzqdmxjmtuzynflbvvmpulodtxtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391320.237902-466-120504075046202/AnsiballZ_file.py'
Oct 02 07:48:40 compute-0 sudo[69803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:40 compute-0 python3.9[69805]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:40 compute-0 sudo[69803]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:41 compute-0 sudo[69955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynppwbfcfeygkbtmwmrsultunrocbpoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391321.0347729-482-190765969499096/AnsiballZ_stat.py'
Oct 02 07:48:41 compute-0 sudo[69955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:41 compute-0 python3.9[69957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:41 compute-0 sudo[69955]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:42 compute-0 sudo[70078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqfxxweysetpblahbzvmuaabtemzloft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391321.0347729-482-190765969499096/AnsiballZ_copy.py'
Oct 02 07:48:42 compute-0 sudo[70078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:42 compute-0 python3.9[70080]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391321.0347729-482-190765969499096/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:42 compute-0 sudo[70078]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:42 compute-0 sudo[70230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teigvbdxkhysindpaotralutgautuoql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391322.4845521-512-24288339922264/AnsiballZ_stat.py'
Oct 02 07:48:42 compute-0 sudo[70230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:43 compute-0 python3.9[70232]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:43 compute-0 sudo[70230]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:43 compute-0 sudo[70353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpkhtjjtrzyntpicfmqrrxkfxzsxrpbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391322.4845521-512-24288339922264/AnsiballZ_copy.py'
Oct 02 07:48:43 compute-0 sudo[70353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:43 compute-0 python3.9[70355]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391322.4845521-512-24288339922264/.source.yaml _original_basename=.ls57xyrt follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:43 compute-0 sudo[70353]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:44 compute-0 sudo[70505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liptwenawiscwojdzmrlwhzsjzzmreun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391323.820515-542-269414957888887/AnsiballZ_stat.py'
Oct 02 07:48:44 compute-0 sudo[70505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:44 compute-0 python3.9[70507]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:44 compute-0 sudo[70505]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:44 compute-0 sudo[70628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfpdtyufpxpclqbipmndqhjarfecotzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391323.820515-542-269414957888887/AnsiballZ_copy.py'
Oct 02 07:48:44 compute-0 sudo[70628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:45 compute-0 python3.9[70630]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391323.820515-542-269414957888887/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:45 compute-0 sudo[70628]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:45 compute-0 sudo[70780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxmekgtroyqdnwodvsevbqesggylylps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391325.3629067-572-107671317483417/AnsiballZ_command.py'
Oct 02 07:48:45 compute-0 sudo[70780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:46 compute-0 python3.9[70782]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:48:46 compute-0 sudo[70780]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:46 compute-0 sudo[70933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twzkkmyjarujcbwzztjzncdirdmgehwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391326.3544183-588-151876395334177/AnsiballZ_command.py'
Oct 02 07:48:46 compute-0 sudo[70933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:46 compute-0 python3.9[70935]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:48:46 compute-0 sudo[70933]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:47 compute-0 sudo[71086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afgizkwrpqbfiqwpeztswwmgznjpyqry ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759391327.1827455-604-154294854042579/AnsiballZ_edpm_nftables_from_files.py'
Oct 02 07:48:47 compute-0 sudo[71086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:47 compute-0 python3[71088]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 02 07:48:47 compute-0 sudo[71086]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:48 compute-0 sudo[71238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkqibwbsousszzrqnkvfwbjpbortehka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391328.1641712-620-280972192000132/AnsiballZ_stat.py'
Oct 02 07:48:48 compute-0 sudo[71238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:48 compute-0 python3.9[71240]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:48 compute-0 sudo[71238]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:49 compute-0 sudo[71361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vytpkhjmdjbrkpnkpzawvnynvxotgogm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391328.1641712-620-280972192000132/AnsiballZ_copy.py'
Oct 02 07:48:49 compute-0 sudo[71361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:49 compute-0 python3.9[71363]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391328.1641712-620-280972192000132/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:49 compute-0 sudo[71361]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:49 compute-0 sudo[71513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvzmghejglqtvnynisoxpgumdzksanwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391329.5581062-650-209971786743815/AnsiballZ_stat.py'
Oct 02 07:48:49 compute-0 sudo[71513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:50 compute-0 python3.9[71515]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:50 compute-0 sudo[71513]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:50 compute-0 sudo[71636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkzktbfcdgijwipmodwfxtxcgdkzkgke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391329.5581062-650-209971786743815/AnsiballZ_copy.py'
Oct 02 07:48:50 compute-0 sudo[71636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:50 compute-0 python3.9[71638]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391329.5581062-650-209971786743815/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:50 compute-0 sudo[71636]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:51 compute-0 sudo[71788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsylffvkmdkizztunlqhhtpkuypqjivu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391331.0489068-680-14590629969349/AnsiballZ_stat.py'
Oct 02 07:48:51 compute-0 sudo[71788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:51 compute-0 python3.9[71790]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:51 compute-0 sudo[71788]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:52 compute-0 sudo[71911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atnvqvrbeqkioswykdcwxjehysqqrjyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391331.0489068-680-14590629969349/AnsiballZ_copy.py'
Oct 02 07:48:52 compute-0 sudo[71911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:52 compute-0 python3.9[71913]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391331.0489068-680-14590629969349/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:52 compute-0 sudo[71911]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:52 compute-0 sudo[72063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elwnxlvtkxscfijabgvotvfabmxewexb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391332.456585-710-87107027980256/AnsiballZ_stat.py'
Oct 02 07:48:52 compute-0 sudo[72063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:53 compute-0 python3.9[72065]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:53 compute-0 sudo[72063]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:53 compute-0 sudo[72186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vweidluofrbbqqnnznwcgtctyunjcgph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391332.456585-710-87107027980256/AnsiballZ_copy.py'
Oct 02 07:48:53 compute-0 sudo[72186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:53 compute-0 python3.9[72188]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391332.456585-710-87107027980256/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:53 compute-0 sudo[72186]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:54 compute-0 sudo[72338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nemyhljiqkwblpxtgbotskyjalhjpfnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391334.012169-740-163218898791389/AnsiballZ_stat.py'
Oct 02 07:48:54 compute-0 sudo[72338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:54 compute-0 python3.9[72340]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:48:54 compute-0 sudo[72338]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:55 compute-0 sudo[72461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpgzgjtjcmzuaghnwhzbfucnwaepkypy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391334.012169-740-163218898791389/AnsiballZ_copy.py'
Oct 02 07:48:55 compute-0 sudo[72461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:55 compute-0 python3.9[72463]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391334.012169-740-163218898791389/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:55 compute-0 sudo[72461]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:55 compute-0 sudo[72613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfuxoehnefqfygmrsftycjdkkfwjgney ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391335.585373-770-29727153956517/AnsiballZ_file.py'
Oct 02 07:48:55 compute-0 sudo[72613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:56 compute-0 python3.9[72615]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:56 compute-0 sudo[72613]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:56 compute-0 sudo[72765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmugaiwmeyccdtpjhxwoeafwvkrygbqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391336.403902-786-31242156816256/AnsiballZ_command.py'
Oct 02 07:48:56 compute-0 sudo[72765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:56 compute-0 python3.9[72767]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:48:57 compute-0 sudo[72765]: pam_unix(sudo:session): session closed for user root
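The command logged just above concatenates the five EDPM nftables fragments and runs them through `nft -c -f -`, which parses the combined ruleset without applying it. A minimal sketch of that check, using stub files in a temp directory since the real `/etc/nftables/edpm-*.nft` contents are not available here (and guarding the `nft` call since it may not be installed):

```shell
# Stub reproduction of the logged syntax check; file names and ordering match
# the log, file contents are placeholders.
set -e
tmp=$(mktemp -d)
for f in edpm-chains edpm-flushes edpm-rules edpm-update-jumps edpm-jumps; do
  printf '# %s\n' "$f" > "$tmp/$f.nft"
done
# Same ordering as the logged pipeline: chains, flushes, rules, update-jumps, jumps
cat "$tmp"/edpm-chains.nft "$tmp"/edpm-flushes.nft "$tmp"/edpm-rules.nft \
    "$tmp"/edpm-update-jumps.nft "$tmp"/edpm-jumps.nft > "$tmp/combined.nft"
# 'nft -c -f FILE' checks syntax without installing rules; skip if nft is absent
if command -v nft >/dev/null 2>&1; then
  nft -c -f "$tmp/combined.nft"
fi
wc -l < "$tmp/combined.nft"
```

The `-c` (check) flag is what makes this safe to run before the later apply step touches the live ruleset.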
Oct 02 07:48:57 compute-0 sudo[72924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opnuktoxlobpdllccizwdhimmmhcqfoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391337.295415-802-128974886935421/AnsiballZ_blockinfile.py'
Oct 02 07:48:57 compute-0 sudo[72924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:58 compute-0 python3.9[72926]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:58 compute-0 sudo[72924]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:58 compute-0 sudo[73077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txwjqatxdqanlmulayxpgrmwkwyhslro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391338.3402953-820-181709628868721/AnsiballZ_file.py'
Oct 02 07:48:58 compute-0 sudo[73077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:58 compute-0 python3.9[73079]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:58 compute-0 sudo[73077]: pam_unix(sudo:session): session closed for user root
Oct 02 07:48:59 compute-0 sudo[73229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmlfjeemapvifyqlnnfgloubedwnitrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391339.1040664-820-158435382167386/AnsiballZ_file.py'
Oct 02 07:48:59 compute-0 sudo[73229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:48:59 compute-0 python3.9[73231]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:48:59 compute-0 sudo[73229]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:00 compute-0 sudo[73381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svyyvqyxwknohmaqxmoijcwgrupqmghz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391339.9123633-850-49983549536370/AnsiballZ_mount.py'
Oct 02 07:49:00 compute-0 sudo[73381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:00 compute-0 python3.9[73383]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 02 07:49:00 compute-0 sudo[73381]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:00 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 07:49:00 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 07:49:01 compute-0 sudo[73535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyxomeqdawnserjigxojblkbvhseqyej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391340.8815038-850-4573619726390/AnsiballZ_mount.py'
Oct 02 07:49:01 compute-0 sudo[73535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:01 compute-0 python3.9[73537]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 02 07:49:01 compute-0 sudo[73535]: pam_unix(sudo:session): session closed for user root
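The two `ansible.posix.mount` calls above mount hugetlbfs at `/dev/hugepages1G` and `/dev/hugepages2M` with matching `pagesize=` options, and `boot=True` persists them to fstab. A hedged sketch of the fstab entries this roughly produces (the exact format the module writes is an assumption; actually mounting requires root):

```shell
# Generate fstab-style lines mirroring the logged mount parameters:
# src=none, fstype=hugetlbfs, opts=pagesize=<size>, dump=0, passno=0.
fstab=$(for size in 1G 2M; do
  printf 'none /dev/hugepages%s hugetlbfs pagesize=%s 0 0\n' "$size" "$size"
done)
echo "$fstab"
```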
Oct 02 07:49:01 compute-0 sshd-session[65703]: Connection closed by 192.168.122.30 port 59150
Oct 02 07:49:01 compute-0 sshd-session[65700]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:49:01 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Oct 02 07:49:01 compute-0 systemd[1]: session-16.scope: Consumed 36.533s CPU time.
Oct 02 07:49:01 compute-0 systemd-logind[827]: Session 16 logged out. Waiting for processes to exit.
Oct 02 07:49:01 compute-0 systemd-logind[827]: Removed session 16.
Oct 02 07:49:07 compute-0 chronyd[60942]: Selected source 216.232.132.95 (pool.ntp.org)
Oct 02 07:49:07 compute-0 sshd-session[73563]: Accepted publickey for zuul from 192.168.122.30 port 45668 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:49:07 compute-0 systemd-logind[827]: New session 17 of user zuul.
Oct 02 07:49:07 compute-0 systemd[1]: Started Session 17 of User zuul.
Oct 02 07:49:07 compute-0 sshd-session[73563]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:49:08 compute-0 sudo[73716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxlhvdtrkpjuitzlhynkbjvodlurpdod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391347.610956-17-7107851911130/AnsiballZ_tempfile.py'
Oct 02 07:49:08 compute-0 sudo[73716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:08 compute-0 python3.9[73718]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct 02 07:49:08 compute-0 sudo[73716]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:08 compute-0 sudo[73868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebqyrxhbneijbrovlbgradaewdppbkmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391348.5575645-41-158969796077788/AnsiballZ_stat.py'
Oct 02 07:49:08 compute-0 sudo[73868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:09 compute-0 python3.9[73870]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:49:09 compute-0 sudo[73868]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:09 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 02 07:49:10 compute-0 sudo[74022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cihpwhnmsgkeaidjoixxlszhzbwsbkol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391349.5228171-61-117751386991224/AnsiballZ_setup.py'
Oct 02 07:49:10 compute-0 sudo[74022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:10 compute-0 python3.9[74024]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:49:10 compute-0 sudo[74022]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:11 compute-0 sudo[74174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgwcjdgkrnemyislbepsdkvfqetdcmak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391350.7891185-78-72117404813959/AnsiballZ_blockinfile.py'
Oct 02 07:49:11 compute-0 sudo[74174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:11 compute-0 python3.9[74176]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDBDT4Fd4G2Cmk9uvXbKHwWysjzqKrtl8C1d8zP2LIa4nDohgRyvywSwLJSm8DNQKobtbA0fib8Eh0dnVLXS/S/zpKf/ioxuyTR2YcjFwm/JmKjPyQP1zyfHdPzSbEyy7Z7yC1VaFTwDso5Bmm5Q8EUr3sssxGzIeTwRfpL8xP3Ncwfe+uLGg3R3cicoQkcKYZyKmQqlVn1frc7XR+Xc7dJgTvxrNpRi/YWVtyoQZrlgSvM6O7MrAbcjy+ZNYtQ3arjVPvJWdZjBzbQ3DTdUvfWatuL7lHX6bYzM9UtEKzf1bnzT5Ke6lg/wPFWXrz4reWKdaCoBftlc63V23uzMXS0+m+27f6P2cC7Nb/MyERq31VUaT6l8Sp4qYI2OJAAllRodv3BfE8ioc3Y8/w1l1qJgWgJGaL+5bNEBT6zob2FH3aLGQslqCpdEkXQM3VKuh7PxKrU2F7++xUJ5HFZGQvrtAPicuTiHVzgYm9XhWFeNaWooMZCDRHUNmXm9bojJkE=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIItu+/2aO5i4gOqT/ek26dSXaVl3cymNNzx/UnQTr+RG
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMenFaCh1mWEKihJIeSCKBRcxx5/XFzOzoysIPmTKXNbflYudEzZwfxa7Xi+VoJBFBr0qcgSGMNJunJExPCzdDQ=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCNczjv+lRAtr+PVbEZ1zEXmgI4BV0+MCzCBVXmOtP4lEwc8H9EMz7OiJAz/2vXHa+BhaZDue37Elo1Y7Nx07Hjp5VmzhKUS0C4J+i8w4YMLBKjQHtS1XISgYJZ5mZSpm/eYibqG4A7j0La6+hRplDABequGRLRHz+XAEzBVWl6DvHhofNQTg78jGYKsOROBjFnCw1E1it9MsS67EBjRtt/f7Zsm4sPkuk1PnunOqh3UJkVds+8D2Pd6g29qDZvrvvkb8zFtyloGQD54MnxodKZ43FaFO5ef9hxoR+fTMYl6ADAHLy93Swtkev9weYPczF13ERS79tnbhFDsWm+TAceAV82KDyrs1+5uSh0Nym5n0tAeTlVLTV0H/4pcuvtd9d+YqZEEHmbg778igs+eBgjWkM3imyiwCi+Pn5tdW41dAWPr/w3XzUVrm7SB7NVsylUbe2Woz5rBagk6QzhmipiJWEKqQ54vT2E0assY2JcGaf5SLJ9XMe572hJ4sVy8lk=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOj3RqMaXOMjoIja1qkK/pLD4xu74QmkyYHdCQFLQIkW
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLluoyjX3rJhNy6CY19x0XekmJToS5yyoN/f2Bw+qUCFTSUt44d1IyvUYmwUzQ6BNPyh0urjoqiaCBgG1FeI67Y=
                                             create=True mode=0644 path=/tmp/ansible.liygm45p state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:49:11 compute-0 sudo[74174]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:12 compute-0 sudo[74326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbcuafwsbsjjqvohqcujmylgtzltsnyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391351.665564-94-173604717552083/AnsiballZ_command.py'
Oct 02 07:49:12 compute-0 sudo[74326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:12 compute-0 python3.9[74328]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.liygm45p' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:49:12 compute-0 sudo[74326]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:12 compute-0 sudo[74480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcaobgrcslazgdkjlerovhzauspoocro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391352.5140066-110-72101527112757/AnsiballZ_file.py'
Oct 02 07:49:12 compute-0 sudo[74480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:13 compute-0 python3.9[74482]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.liygm45p state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:49:13 compute-0 sudo[74480]: pam_unix(sudo:session): session closed for user root
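The session above rebuilds the system-wide known-hosts file in three logged steps: `blockinfile` writes a managed block of host keys into a tempfile, a shell `cat` overwrites the target with it, and `file state=absent` removes the tempfile. A sketch of that pattern with stand-in paths and a truncated placeholder key (not the real `/etc/ssh/ssh_known_hosts` or key material):

```shell
# Stand-in paths; the real flow targets /etc/ssh/ssh_known_hosts as root.
set -e
tmpdir=$(mktemp -d)
known=$tmpdir/ssh_known_hosts
tmpfile=$tmpdir/ansible.tmp
{
  echo '# BEGIN ANSIBLE MANAGED BLOCK'
  echo 'compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAA...'
  echo '# END ANSIBLE MANAGED BLOCK'
} > "$tmpfile"
cat "$tmpfile" > "$known"   # same overwrite pattern as the logged command
rm -f "$tmpfile"            # matches the final 'state=absent' cleanup
grep -c 'ANSIBLE MANAGED BLOCK' "$known"
```

Building the block in a tempfile first means the validated content lands in the target in a single overwrite rather than incremental edits.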
Oct 02 07:49:13 compute-0 sshd-session[73566]: Connection closed by 192.168.122.30 port 45668
Oct 02 07:49:13 compute-0 sshd-session[73563]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:49:13 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Oct 02 07:49:13 compute-0 systemd[1]: session-17.scope: Consumed 3.803s CPU time.
Oct 02 07:49:13 compute-0 systemd-logind[827]: Session 17 logged out. Waiting for processes to exit.
Oct 02 07:49:13 compute-0 systemd-logind[827]: Removed session 17.
Oct 02 07:49:19 compute-0 sshd-session[74507]: Accepted publickey for zuul from 192.168.122.30 port 56602 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:49:19 compute-0 systemd-logind[827]: New session 18 of user zuul.
Oct 02 07:49:19 compute-0 systemd[1]: Started Session 18 of User zuul.
Oct 02 07:49:19 compute-0 sshd-session[74507]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:49:20 compute-0 python3.9[74660]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:49:21 compute-0 sudo[74814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngssjmtvrapjlsfdptbkhwdkjvymjokv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391360.9790714-44-174203098917794/AnsiballZ_systemd.py'
Oct 02 07:49:21 compute-0 sudo[74814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:21 compute-0 python3.9[74816]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 02 07:49:22 compute-0 sudo[74814]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:22 compute-0 sudo[74968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbqhnixslbfyitdortetwghmbkpbfrad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391362.2496135-60-172016926739309/AnsiballZ_systemd.py'
Oct 02 07:49:22 compute-0 sudo[74968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:22 compute-0 python3.9[74970]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:49:23 compute-0 sudo[74968]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:24 compute-0 sudo[75121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjfgvibpjbpnsacxmesbxgxycrasoncr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391364.1502368-78-47053068801323/AnsiballZ_command.py'
Oct 02 07:49:24 compute-0 sudo[75121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:24 compute-0 python3.9[75123]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:49:24 compute-0 sudo[75121]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:25 compute-0 sudo[75274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxivhmfooknbninwsgxfdvvaazwusrkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391364.9992938-94-38663785015142/AnsiballZ_stat.py'
Oct 02 07:49:25 compute-0 sudo[75274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:25 compute-0 python3.9[75276]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:49:25 compute-0 sudo[75274]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:26 compute-0 sudo[75428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecizjngapsilsxnslirjiskmdelesgdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391365.9256027-110-43629165458826/AnsiballZ_command.py'
Oct 02 07:49:26 compute-0 sudo[75428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:26 compute-0 python3.9[75430]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:49:26 compute-0 sudo[75428]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:27 compute-0 sudo[75583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceovadbrslcbviguhdgzgqmjqjepesqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391366.793696-126-149029532404671/AnsiballZ_file.py'
Oct 02 07:49:27 compute-0 sudo[75583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:27 compute-0 python3.9[75585]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:49:27 compute-0 sudo[75583]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:27 compute-0 sshd-session[74510]: Connection closed by 192.168.122.30 port 56602
Oct 02 07:49:27 compute-0 sshd-session[74507]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:49:27 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Oct 02 07:49:27 compute-0 systemd[1]: session-18.scope: Consumed 5.341s CPU time.
Oct 02 07:49:27 compute-0 systemd-logind[827]: Session 18 logged out. Waiting for processes to exit.
Oct 02 07:49:27 compute-0 systemd-logind[827]: Removed session 18.
Oct 02 07:49:33 compute-0 sshd-session[75611]: Accepted publickey for zuul from 192.168.122.30 port 36234 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:49:33 compute-0 systemd-logind[827]: New session 19 of user zuul.
Oct 02 07:49:33 compute-0 systemd[1]: Started Session 19 of User zuul.
Oct 02 07:49:33 compute-0 sshd-session[75611]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:49:34 compute-0 python3.9[75764]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:49:35 compute-0 sudo[75918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwlpzsbwaikxpkiezvaqyqklaakadalt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391375.3913305-48-77090041516923/AnsiballZ_setup.py'
Oct 02 07:49:35 compute-0 sudo[75918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:36 compute-0 python3.9[75920]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:49:36 compute-0 sudo[75918]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:36 compute-0 sudo[76002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqssqglxtssfjzphknsviqrxjpxbzoxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391375.3913305-48-77090041516923/AnsiballZ_dnf.py'
Oct 02 07:49:36 compute-0 sudo[76002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:37 compute-0 python3.9[76004]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 02 07:49:38 compute-0 sudo[76002]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:39 compute-0 python3.9[76155]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:49:40 compute-0 python3.9[76306]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 02 07:49:41 compute-0 python3.9[76456]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:49:42 compute-0 python3.9[76606]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:49:42 compute-0 sshd-session[75614]: Connection closed by 192.168.122.30 port 36234
Oct 02 07:49:42 compute-0 sshd-session[75611]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:49:43 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Oct 02 07:49:43 compute-0 systemd[1]: session-19.scope: Consumed 6.679s CPU time.
Oct 02 07:49:43 compute-0 systemd-logind[827]: Session 19 logged out. Waiting for processes to exit.
Oct 02 07:49:43 compute-0 systemd-logind[827]: Removed session 19.
Oct 02 07:49:48 compute-0 sshd-session[76631]: Accepted publickey for zuul from 192.168.122.30 port 37492 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:49:48 compute-0 systemd-logind[827]: New session 20 of user zuul.
Oct 02 07:49:48 compute-0 systemd[1]: Started Session 20 of User zuul.
Oct 02 07:49:48 compute-0 sshd-session[76631]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:49:50 compute-0 python3.9[76784]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:49:51 compute-0 sudo[76938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buegjkhpypgbilukhhukinrckmhxcfwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391391.3778853-81-280433152450521/AnsiballZ_file.py'
Oct 02 07:49:51 compute-0 sudo[76938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:52 compute-0 python3.9[76940]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:49:52 compute-0 sudo[76938]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:52 compute-0 sudo[77090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kplemhzmzlbkdvbksepehsalefijmjgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391392.2974381-81-3280657417348/AnsiballZ_file.py'
Oct 02 07:49:52 compute-0 sudo[77090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:52 compute-0 python3.9[77092]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:49:52 compute-0 sudo[77090]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:53 compute-0 sudo[77242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqohotvddmwyhnlqslkxqocbjjkxcgsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391393.1089818-113-110858752240691/AnsiballZ_stat.py'
Oct 02 07:49:53 compute-0 sudo[77242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:53 compute-0 python3.9[77244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:49:53 compute-0 sudo[77242]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:54 compute-0 sudo[77365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxpbobnksmgyxfduvniekmmnhounwrtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391393.1089818-113-110858752240691/AnsiballZ_copy.py'
Oct 02 07:49:54 compute-0 sudo[77365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:54 compute-0 python3.9[77367]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391393.1089818-113-110858752240691/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=f0d40a8996f766ac1f174afb546dc177c7916a5e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:49:54 compute-0 sudo[77365]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:55 compute-0 sudo[77517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqeviaoimvnyvoihqlshiayahxqryyml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391394.7245176-113-261834589592433/AnsiballZ_stat.py'
Oct 02 07:49:55 compute-0 sudo[77517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:55 compute-0 python3.9[77519]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:49:55 compute-0 sudo[77517]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:55 compute-0 sudo[77640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltglchpravcpbzhndyxluzkcqcotaace ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391394.7245176-113-261834589592433/AnsiballZ_copy.py'
Oct 02 07:49:55 compute-0 sudo[77640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:55 compute-0 python3.9[77642]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391394.7245176-113-261834589592433/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=8f08c1fedf1d84a8a1e32492ff65ff643941e887 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:49:55 compute-0 sudo[77640]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:56 compute-0 sudo[77792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcwmtzkzlxujqwbeytyfxdgfonsggrwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391396.093997-113-2711662407435/AnsiballZ_stat.py'
Oct 02 07:49:56 compute-0 sudo[77792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:56 compute-0 python3.9[77794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:49:56 compute-0 sudo[77792]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:57 compute-0 sudo[77915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adgsdcsltekcvekbxpfwwkjllptrgevu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391396.093997-113-2711662407435/AnsiballZ_copy.py'
Oct 02 07:49:57 compute-0 sudo[77915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:57 compute-0 python3.9[77917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391396.093997-113-2711662407435/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=7828cf2fe18f0666722dc536ed962d5af8ae13ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:49:57 compute-0 sudo[77915]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:57 compute-0 sudo[78067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxnmzvcwlzgatluyhebsjqkbrtpqbltq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391397.5815725-197-49233211284999/AnsiballZ_file.py'
Oct 02 07:49:57 compute-0 sudo[78067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:58 compute-0 python3.9[78069]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:49:58 compute-0 sudo[78067]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:58 compute-0 sudo[78219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kflciaixjfgpzhktxvylmbgyuvhtpzjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391398.3342116-197-9978102246442/AnsiballZ_file.py'
Oct 02 07:49:58 compute-0 sudo[78219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:58 compute-0 python3.9[78221]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:49:58 compute-0 sudo[78219]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:59 compute-0 sudo[78371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjgxymxghsteidjzzbdofyveoonxysdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391399.0387235-228-164206262425560/AnsiballZ_stat.py'
Oct 02 07:49:59 compute-0 sudo[78371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:49:59 compute-0 python3.9[78373]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:49:59 compute-0 sudo[78371]: pam_unix(sudo:session): session closed for user root
Oct 02 07:49:59 compute-0 sudo[78494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eggummkzuxjbqkzbzyfeunlxsobiesuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391399.0387235-228-164206262425560/AnsiballZ_copy.py'
Oct 02 07:49:59 compute-0 sudo[78494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:00 compute-0 python3.9[78496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391399.0387235-228-164206262425560/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=694d06d5fe416f2dcc6aef594504bdf9afce701d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:00 compute-0 sudo[78494]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:00 compute-0 sudo[78646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssgpgdwdexclreozsdehfpqhqgfrsayh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391400.3699641-228-253594654151546/AnsiballZ_stat.py'
Oct 02 07:50:00 compute-0 sudo[78646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:00 compute-0 python3.9[78648]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:01 compute-0 sudo[78646]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:01 compute-0 sudo[78769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsfiajhdxidudgwblkfrukhwjttskanf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391400.3699641-228-253594654151546/AnsiballZ_copy.py'
Oct 02 07:50:01 compute-0 sudo[78769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:01 compute-0 python3.9[78771]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391400.3699641-228-253594654151546/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=ae6e4d5078229dba3a3654cc2c2a83c57df65e7e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:01 compute-0 sudo[78769]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:02 compute-0 sudo[78921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryelhnmvzeigarirgswgwwwxrclihazx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391401.6952958-228-129197389219047/AnsiballZ_stat.py'
Oct 02 07:50:02 compute-0 sudo[78921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:02 compute-0 python3.9[78923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:02 compute-0 sudo[78921]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:02 compute-0 sudo[79044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhjljblyhfdmnkjnhocmpcyjatpikohg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391401.6952958-228-129197389219047/AnsiballZ_copy.py'
Oct 02 07:50:02 compute-0 sudo[79044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:02 compute-0 python3.9[79046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391401.6952958-228-129197389219047/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=34a0494ebbebc2056909a34c2569f7f13b4e8756 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:02 compute-0 sudo[79044]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:03 compute-0 sudo[79196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufxuhsjdouevwnxioszehfvioktvxtdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391403.152998-317-264610370878423/AnsiballZ_file.py'
Oct 02 07:50:03 compute-0 sudo[79196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:03 compute-0 python3.9[79198]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:03 compute-0 sudo[79196]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:04 compute-0 sudo[79348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cccsscydntzdyxevqirykemxaarrckls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391403.794163-317-108701153978471/AnsiballZ_file.py'
Oct 02 07:50:04 compute-0 sudo[79348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:04 compute-0 python3.9[79350]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:04 compute-0 sudo[79348]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:04 compute-0 sudo[79500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnnihomwvjkuhzvenrkbqfpivtdmjrsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391404.5800972-348-191550737998515/AnsiballZ_stat.py'
Oct 02 07:50:04 compute-0 sudo[79500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:05 compute-0 python3.9[79502]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:05 compute-0 sudo[79500]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:05 compute-0 sudo[79623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvenpsdnywfeforditdbcgfhkukwnobk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391404.5800972-348-191550737998515/AnsiballZ_copy.py'
Oct 02 07:50:05 compute-0 sudo[79623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:05 compute-0 python3.9[79625]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391404.5800972-348-191550737998515/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=9da6eae0063215f3f61fa2ece49b8897bad7a8b5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:05 compute-0 sudo[79623]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:06 compute-0 sudo[79775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drjfziflgkyiixglrhyaaxmajlxuatxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391405.790653-348-204661841674433/AnsiballZ_stat.py'
Oct 02 07:50:06 compute-0 sudo[79775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:06 compute-0 python3.9[79777]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:06 compute-0 sudo[79775]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:06 compute-0 sudo[79898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hofuxnbthvtycmtomrdxhbcwibnuugwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391405.790653-348-204661841674433/AnsiballZ_copy.py'
Oct 02 07:50:06 compute-0 sudo[79898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:07 compute-0 python3.9[79900]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391405.790653-348-204661841674433/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=887cc920b041a7473dd86a6103437a610f1f353d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:07 compute-0 sudo[79898]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:07 compute-0 sudo[80050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baevekzvkypmezqanuohoncgzxswkkmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391407.159959-348-78538206748346/AnsiballZ_stat.py'
Oct 02 07:50:07 compute-0 sudo[80050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:07 compute-0 python3.9[80052]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:07 compute-0 sudo[80050]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:08 compute-0 sudo[80173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxpgvcgxslqemvrcdhqfbtplpkcxmdqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391407.159959-348-78538206748346/AnsiballZ_copy.py'
Oct 02 07:50:08 compute-0 sudo[80173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:08 compute-0 python3.9[80175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391407.159959-348-78538206748346/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=cbed0ee72c7a06f1411ba1abc6663a9b1372332c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:08 compute-0 sudo[80173]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:08 compute-0 sudo[80325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydlgfdumwnaqtkpygegabximomhxmmas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391408.7275457-435-262511078450461/AnsiballZ_file.py'
Oct 02 07:50:08 compute-0 sudo[80325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:09 compute-0 python3.9[80327]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:09 compute-0 sudo[80325]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:09 compute-0 sudo[80477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yapzyogrbxgjsdjnfamhvssrclvnzjdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391409.389544-435-87590476778565/AnsiballZ_file.py'
Oct 02 07:50:09 compute-0 sudo[80477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:09 compute-0 python3.9[80479]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:09 compute-0 sudo[80477]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:10 compute-0 sudo[80629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxanxeynoyusvnpipqqhsfvkpoawjkdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391410.0798848-464-238167757661762/AnsiballZ_stat.py'
Oct 02 07:50:10 compute-0 sudo[80629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:10 compute-0 python3.9[80631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:10 compute-0 sudo[80629]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:10 compute-0 sudo[80752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ougozhyrpltszlofhenwgckyzsrfqmdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391410.0798848-464-238167757661762/AnsiballZ_copy.py'
Oct 02 07:50:10 compute-0 sudo[80752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:11 compute-0 python3.9[80754]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391410.0798848-464-238167757661762/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=24e058d21dde6193cb2893e8485d556f8a7f621b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:11 compute-0 sudo[80752]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:11 compute-0 sudo[80904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhqgbwnucvyluhfobjcjxifxdnmjxaiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391411.2215476-464-237339033844246/AnsiballZ_stat.py'
Oct 02 07:50:11 compute-0 sudo[80904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:11 compute-0 python3.9[80906]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:11 compute-0 sudo[80904]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:11 compute-0 sudo[81027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhbqfmbnycdapynjpzsnvrvgvelbwehq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391411.2215476-464-237339033844246/AnsiballZ_copy.py'
Oct 02 07:50:11 compute-0 sudo[81027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:12 compute-0 python3.9[81029]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391411.2215476-464-237339033844246/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=887cc920b041a7473dd86a6103437a610f1f353d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:12 compute-0 sudo[81027]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:12 compute-0 sudo[81179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovdfachffsrkolnscamjbhegscuwimrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391412.3753679-464-167635681653963/AnsiballZ_stat.py'
Oct 02 07:50:12 compute-0 sudo[81179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:12 compute-0 python3.9[81181]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:12 compute-0 sudo[81179]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:13 compute-0 sudo[81302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeunlcgvlbnyobnxgrddzgjlwuxzzema ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391412.3753679-464-167635681653963/AnsiballZ_copy.py'
Oct 02 07:50:13 compute-0 sudo[81302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:13 compute-0 python3.9[81304]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391412.3753679-464-167635681653963/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=6d955500a4b515de1a96a3e8382f190b83333a76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:13 compute-0 sudo[81302]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:14 compute-0 sudo[81454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcvtwhbsnmjocjiovzxbxmcofpmkixfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391414.2646518-584-105108162390436/AnsiballZ_file.py'
Oct 02 07:50:14 compute-0 sudo[81454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:14 compute-0 python3.9[81456]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:14 compute-0 sudo[81454]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:15 compute-0 sudo[81606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmqlowmdcihjwelpktvgsrxvwnljvadv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391414.984009-600-242276496675329/AnsiballZ_stat.py'
Oct 02 07:50:15 compute-0 sudo[81606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:15 compute-0 python3.9[81608]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:15 compute-0 sudo[81606]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:16 compute-0 sudo[81729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgtoyndagqnxxfhqxbipsloffxgtchbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391414.984009-600-242276496675329/AnsiballZ_copy.py'
Oct 02 07:50:16 compute-0 sudo[81729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:16 compute-0 python3.9[81731]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391414.984009-600-242276496675329/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eca61761ba526b6f995456bd5ca7bb1b26d84647 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:16 compute-0 sudo[81729]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:16 compute-0 sudo[81881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnkqxgdcfgcmnaknrqkueltcdmpwqzrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391416.4901593-631-47836893538992/AnsiballZ_file.py'
Oct 02 07:50:16 compute-0 sudo[81881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:17 compute-0 python3.9[81883]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:17 compute-0 sudo[81881]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:17 compute-0 sudo[82033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eauwqrfbhzxstrkzcklxhddiobinpivr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391417.277101-648-234553517720379/AnsiballZ_stat.py'
Oct 02 07:50:17 compute-0 sudo[82033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:17 compute-0 python3.9[82035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:17 compute-0 sudo[82033]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:18 compute-0 sudo[82156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdzyueqnegzxcamiusnwbsrpzmqrcmyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391417.277101-648-234553517720379/AnsiballZ_copy.py'
Oct 02 07:50:18 compute-0 sudo[82156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:18 compute-0 python3.9[82158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391417.277101-648-234553517720379/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eca61761ba526b6f995456bd5ca7bb1b26d84647 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:18 compute-0 sudo[82156]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:18 compute-0 sudo[82308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujylirfkyjajquozsgskrgdxetbgttzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391418.6781151-680-175919100735532/AnsiballZ_file.py'
Oct 02 07:50:18 compute-0 sudo[82308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:19 compute-0 python3.9[82310]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:19 compute-0 sudo[82308]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:19 compute-0 sudo[82460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mowtasduyijpdfjlnczhzrzsqywuhvdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391419.410349-696-183355756802341/AnsiballZ_stat.py'
Oct 02 07:50:19 compute-0 sudo[82460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:20 compute-0 python3.9[82462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:20 compute-0 sudo[82460]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:20 compute-0 sudo[82583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlutqcqgwftwabpwhvrlhoienqnrjjci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391419.410349-696-183355756802341/AnsiballZ_copy.py'
Oct 02 07:50:20 compute-0 sudo[82583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:20 compute-0 python3.9[82585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391419.410349-696-183355756802341/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eca61761ba526b6f995456bd5ca7bb1b26d84647 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:20 compute-0 sudo[82583]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:21 compute-0 sudo[82735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pibnubkfmsdbofkmltqbkzikdibwxntc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391420.8263106-727-56570103400855/AnsiballZ_file.py'
Oct 02 07:50:21 compute-0 sudo[82735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:21 compute-0 python3.9[82737]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:21 compute-0 sudo[82735]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:21 compute-0 sudo[82887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnjsxudiwxngsntlfbxmelezdjadokvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391421.5739956-744-203255280232700/AnsiballZ_stat.py'
Oct 02 07:50:21 compute-0 sudo[82887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:22 compute-0 python3.9[82889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:22 compute-0 sudo[82887]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:22 compute-0 sudo[83010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjhmeuetbdbpqsfuqdrexxzmbgueqhoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391421.5739956-744-203255280232700/AnsiballZ_copy.py'
Oct 02 07:50:22 compute-0 sudo[83010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:22 compute-0 python3.9[83012]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391421.5739956-744-203255280232700/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eca61761ba526b6f995456bd5ca7bb1b26d84647 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:22 compute-0 sudo[83010]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:23 compute-0 sudo[83162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbsljlivtxbofdbspppslpscxksjvxje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391422.929183-777-101701632300100/AnsiballZ_file.py'
Oct 02 07:50:23 compute-0 sudo[83162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:23 compute-0 python3.9[83164]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:23 compute-0 sudo[83162]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:23 compute-0 sudo[83314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbgzvxombymawnncawsgxhcoqpuyyqyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391423.6742103-793-186675618521526/AnsiballZ_stat.py'
Oct 02 07:50:23 compute-0 sudo[83314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:24 compute-0 python3.9[83316]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:24 compute-0 sudo[83314]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:24 compute-0 sudo[83437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knhojrljzkolectbbwqcxhtfqkatuznl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391423.6742103-793-186675618521526/AnsiballZ_copy.py'
Oct 02 07:50:24 compute-0 sudo[83437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:24 compute-0 python3.9[83439]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391423.6742103-793-186675618521526/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eca61761ba526b6f995456bd5ca7bb1b26d84647 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:24 compute-0 sudo[83437]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:25 compute-0 sudo[83589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adoxwujrrrrqmbluqaqjberspdbhlzph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391425.0114186-825-147798561644165/AnsiballZ_file.py'
Oct 02 07:50:25 compute-0 sudo[83589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:25 compute-0 python3.9[83591]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:25 compute-0 sudo[83589]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:26 compute-0 sudo[83741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fybfhjiufilkghwucgptgbogityqkegc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391425.7680905-842-214901268432161/AnsiballZ_stat.py'
Oct 02 07:50:26 compute-0 sudo[83741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:26 compute-0 python3.9[83743]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:26 compute-0 sudo[83741]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:26 compute-0 sudo[83864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfoomnwbhiqbpdnovexlhxmnhdphthmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391425.7680905-842-214901268432161/AnsiballZ_copy.py'
Oct 02 07:50:26 compute-0 sudo[83864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:26 compute-0 python3.9[83866]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391425.7680905-842-214901268432161/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eca61761ba526b6f995456bd5ca7bb1b26d84647 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:27 compute-0 sudo[83864]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:27 compute-0 sudo[84016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mavucmbsgidvysspnktutnsaesikgion ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391427.2546182-875-62845954885513/AnsiballZ_file.py'
Oct 02 07:50:27 compute-0 sudo[84016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:27 compute-0 python3.9[84018]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:27 compute-0 sudo[84016]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:28 compute-0 sudo[84168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seilraqyuuznangnshwazdvxakwlkviu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391428.000845-893-17431886446883/AnsiballZ_stat.py'
Oct 02 07:50:28 compute-0 sudo[84168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:28 compute-0 python3.9[84170]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:28 compute-0 sudo[84168]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:29 compute-0 sudo[84291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkokfkfxbatcsuuxwikpeilgabxzditg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391428.000845-893-17431886446883/AnsiballZ_copy.py'
Oct 02 07:50:29 compute-0 sudo[84291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:29 compute-0 python3.9[84293]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391428.000845-893-17431886446883/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eca61761ba526b6f995456bd5ca7bb1b26d84647 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:29 compute-0 sudo[84291]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:29 compute-0 sshd-session[76634]: Connection closed by 192.168.122.30 port 37492
Oct 02 07:50:29 compute-0 sshd-session[76631]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:50:29 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Oct 02 07:50:29 compute-0 systemd[1]: session-20.scope: Consumed 32.900s CPU time.
Oct 02 07:50:29 compute-0 systemd-logind[827]: Session 20 logged out. Waiting for processes to exit.
Oct 02 07:50:29 compute-0 systemd-logind[827]: Removed session 20.
Oct 02 07:50:35 compute-0 sshd-session[84318]: Accepted publickey for zuul from 192.168.122.30 port 55516 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:50:35 compute-0 systemd-logind[827]: New session 21 of user zuul.
Oct 02 07:50:35 compute-0 systemd[1]: Started Session 21 of User zuul.
Oct 02 07:50:35 compute-0 sshd-session[84318]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:50:36 compute-0 python3.9[84471]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:50:37 compute-0 sudo[84625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqiezncoooeixhzzfklfhqjugorltjti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391437.1396947-48-233079379696531/AnsiballZ_file.py'
Oct 02 07:50:37 compute-0 sudo[84625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:37 compute-0 python3.9[84627]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:37 compute-0 sudo[84625]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:38 compute-0 sudo[84777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpmbpjougfoxqrfdekkfxwzzquvcpagk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391438.0530689-48-52343534653304/AnsiballZ_file.py'
Oct 02 07:50:38 compute-0 sudo[84777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:38 compute-0 python3.9[84779]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:50:38 compute-0 sudo[84777]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:39 compute-0 python3.9[84929]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:50:39 compute-0 PackageKit[31176]: daemon quit
Oct 02 07:50:39 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 02 07:50:40 compute-0 sudo[85079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfpbhltroumjbdyyptnewkhsfkiiwwyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391439.6160417-94-246545286879796/AnsiballZ_seboolean.py'
Oct 02 07:50:40 compute-0 sudo[85079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:40 compute-0 python3.9[85081]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 02 07:50:41 compute-0 sudo[85079]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:42 compute-0 sudo[85235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvgltqcxojralyjfqdzxhbgfdgncixjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391441.900471-114-7490198156866/AnsiballZ_setup.py'
Oct 02 07:50:42 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct 02 07:50:42 compute-0 sudo[85235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:42 compute-0 python3.9[85237]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:50:42 compute-0 sudo[85235]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:43 compute-0 sudo[85319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwwwdstivkqpcbecdckgqoiijerovcgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391441.900471-114-7490198156866/AnsiballZ_dnf.py'
Oct 02 07:50:43 compute-0 sudo[85319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:43 compute-0 python3.9[85321]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:50:44 compute-0 sudo[85319]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:45 compute-0 sudo[85472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrqqzswqccsupiorgzrnzwyjvgndysmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391445.0117645-138-126783824797599/AnsiballZ_systemd.py'
Oct 02 07:50:45 compute-0 sudo[85472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:46 compute-0 python3.9[85474]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 02 07:50:46 compute-0 sudo[85472]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:46 compute-0 sudo[85627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-legxcgaqexqfvdxanijinxgchuyfvwyw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759391446.386176-154-170642246595104/AnsiballZ_edpm_nftables_snippet.py'
Oct 02 07:50:46 compute-0 sudo[85627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:47 compute-0 python3[85629]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct 02 07:50:47 compute-0 sudo[85627]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:47 compute-0 sudo[85779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nszulogimpwfesnjnkezqgxhecetptxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391447.5133455-172-134481880299646/AnsiballZ_file.py'
Oct 02 07:50:47 compute-0 sudo[85779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:48 compute-0 python3.9[85781]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:48 compute-0 sudo[85779]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:48 compute-0 sudo[85931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahbtpyqdgueazktreavycxujctbjptda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391448.2756007-188-264051224970957/AnsiballZ_stat.py'
Oct 02 07:50:48 compute-0 sudo[85931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:49 compute-0 python3.9[85933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:49 compute-0 sudo[85931]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:49 compute-0 sudo[86009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xewxvcyemrfzkzfsibqfewlqnljhaekx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391448.2756007-188-264051224970957/AnsiballZ_file.py'
Oct 02 07:50:49 compute-0 sudo[86009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:49 compute-0 python3.9[86011]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:49 compute-0 sudo[86009]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:50 compute-0 sudo[86161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebekxhaymoseonxcezwpynikvdwmktwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391449.8302097-212-186808463720266/AnsiballZ_stat.py'
Oct 02 07:50:50 compute-0 sudo[86161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:50 compute-0 python3.9[86163]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:50 compute-0 sudo[86161]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:50 compute-0 sudo[86239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfzadijwoajepuvqixpzdbvvsiduuyzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391449.8302097-212-186808463720266/AnsiballZ_file.py'
Oct 02 07:50:50 compute-0 sudo[86239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:50 compute-0 python3.9[86241]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hrgnmxdj recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:50 compute-0 sudo[86239]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:51 compute-0 sudo[86391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtdidpiynxfluejlxykepirqklmmxyml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391451.0824194-236-202938513912973/AnsiballZ_stat.py'
Oct 02 07:50:51 compute-0 sudo[86391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:51 compute-0 python3.9[86393]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:51 compute-0 sudo[86391]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:51 compute-0 sudo[86469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkzaupyimakfwojjsolyruiiozisrlou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391451.0824194-236-202938513912973/AnsiballZ_file.py'
Oct 02 07:50:51 compute-0 sudo[86469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:52 compute-0 python3.9[86471]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:52 compute-0 sudo[86469]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:52 compute-0 sudo[86621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfpcuapolpcgpucbimoygfscmgtfgzle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391452.4278274-262-247456650991248/AnsiballZ_command.py'
Oct 02 07:50:52 compute-0 sudo[86621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:53 compute-0 python3.9[86623]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:50:53 compute-0 sudo[86621]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:53 compute-0 sudo[86774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvxsmcaslhlxvnsoejhbmuhsgbmmqsor ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759391453.4710422-278-120593154492305/AnsiballZ_edpm_nftables_from_files.py'
Oct 02 07:50:53 compute-0 sudo[86774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:54 compute-0 python3[86776]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 02 07:50:54 compute-0 sudo[86774]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:54 compute-0 sudo[86926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmldxllspxhyrucztyxcpwrjbjywcgti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391454.397707-294-160511687020314/AnsiballZ_stat.py'
Oct 02 07:50:54 compute-0 sudo[86926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:54 compute-0 python3.9[86928]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:54 compute-0 sudo[86926]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:55 compute-0 sudo[87051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnqkylanfcbwuibaraurqfpybvelcptt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391454.397707-294-160511687020314/AnsiballZ_copy.py'
Oct 02 07:50:55 compute-0 sudo[87051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:55 compute-0 python3.9[87053]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391454.397707-294-160511687020314/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:55 compute-0 sudo[87051]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:56 compute-0 sudo[87203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khowxklwcswwnpitrgbeksnviprecitf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391455.8654144-324-277391128936780/AnsiballZ_stat.py'
Oct 02 07:50:56 compute-0 sudo[87203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:56 compute-0 python3.9[87205]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:56 compute-0 sudo[87203]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:57 compute-0 sudo[87328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqbdpxiruydpwxiquhnrlvbtypujjhli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391455.8654144-324-277391128936780/AnsiballZ_copy.py'
Oct 02 07:50:57 compute-0 sudo[87328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:57 compute-0 python3.9[87330]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391455.8654144-324-277391128936780/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:57 compute-0 sudo[87328]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:57 compute-0 sudo[87480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfcnmfcjpfdqhgqheeiirxdnsbcejzfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391457.4541202-354-80075673035769/AnsiballZ_stat.py'
Oct 02 07:50:57 compute-0 sudo[87480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:58 compute-0 python3.9[87482]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:58 compute-0 sudo[87480]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:58 compute-0 sudo[87605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkwkzcwpxadvwweguqlwabntkhjhwsha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391457.4541202-354-80075673035769/AnsiballZ_copy.py'
Oct 02 07:50:58 compute-0 sudo[87605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:58 compute-0 python3.9[87607]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391457.4541202-354-80075673035769/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:50:58 compute-0 sudo[87605]: pam_unix(sudo:session): session closed for user root
Oct 02 07:50:59 compute-0 sudo[87757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chsvpetepocmdftvtbbquxvurhngjsns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391459.040608-384-81170125044848/AnsiballZ_stat.py'
Oct 02 07:50:59 compute-0 sudo[87757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:50:59 compute-0 python3.9[87759]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:50:59 compute-0 sudo[87757]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:00 compute-0 sudo[87882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ealitcywhzzgdabbzgaaqiqdvikkizpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391459.040608-384-81170125044848/AnsiballZ_copy.py'
Oct 02 07:51:00 compute-0 sudo[87882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:00 compute-0 python3.9[87884]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391459.040608-384-81170125044848/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:00 compute-0 sudo[87882]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:00 compute-0 sudo[88034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghruunmonnmkartfkepwvexdzeosnmzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391460.5418835-414-216889683862713/AnsiballZ_stat.py'
Oct 02 07:51:00 compute-0 sudo[88034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:01 compute-0 python3.9[88036]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:51:01 compute-0 sudo[88034]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:01 compute-0 sudo[88161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqhiiddnuruclghvrkhuxxfntroyjsbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391460.5418835-414-216889683862713/AnsiballZ_copy.py'
Oct 02 07:51:01 compute-0 sudo[88161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:01 compute-0 sshd-session[88037]: Received disconnect from 193.46.255.159 port 38394:11:  [preauth]
Oct 02 07:51:01 compute-0 sshd-session[88037]: Disconnected from authenticating user root 193.46.255.159 port 38394 [preauth]
Oct 02 07:51:01 compute-0 python3.9[88163]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391460.5418835-414-216889683862713/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:01 compute-0 sudo[88161]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:02 compute-0 sudo[88313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-namuvnebegwglbvlhwdhlezxugqhovbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391462.1584718-444-60459742926923/AnsiballZ_file.py'
Oct 02 07:51:02 compute-0 sudo[88313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:02 compute-0 python3.9[88315]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:02 compute-0 sudo[88313]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:03 compute-0 sudo[88465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvqittpabzmuhzddouiivpkpdavulyoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391463.024552-460-198736326543273/AnsiballZ_command.py'
Oct 02 07:51:03 compute-0 sudo[88465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:03 compute-0 python3.9[88467]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:51:03 compute-0 sudo[88465]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:04 compute-0 sudo[88620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhwctyhvkbkkdsaajvttjjmlsgnskwqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391463.784529-476-134210974165072/AnsiballZ_blockinfile.py'
Oct 02 07:51:04 compute-0 sudo[88620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:04 compute-0 python3.9[88622]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:04 compute-0 sudo[88620]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:05 compute-0 sudo[88772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuimpifxxvabnlcyaegrimniugvblvsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391464.7810504-494-190685499199036/AnsiballZ_command.py'
Oct 02 07:51:05 compute-0 sudo[88772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:05 compute-0 python3.9[88774]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:51:05 compute-0 sudo[88772]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:05 compute-0 sudo[88925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjqzitkwxqrgkrtxdslzjeyyjgqslxvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391465.5250816-510-244264013798098/AnsiballZ_stat.py'
Oct 02 07:51:05 compute-0 sudo[88925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:06 compute-0 python3.9[88927]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:51:06 compute-0 sudo[88925]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:06 compute-0 sudo[89079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppemykqebpnbvhvjhaajqnscombfqbbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391466.334588-526-178862282862093/AnsiballZ_command.py'
Oct 02 07:51:06 compute-0 sudo[89079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:06 compute-0 python3.9[89081]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:51:06 compute-0 sudo[89079]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:07 compute-0 sudo[89234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mohfkchoqaldjmhghqwucyuqdhkgnxks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391467.1046767-542-239505660880583/AnsiballZ_file.py'
Oct 02 07:51:07 compute-0 sudo[89234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:07 compute-0 python3.9[89236]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:07 compute-0 sudo[89234]: pam_unix(sudo:session): session closed for user root
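The preceding events show an idempotency pattern: writing new rules touches `/etc/nftables/edpm-rules.nft.changed`, the apply step (`cat … | nft -f -`) runs only after a stat confirms the marker exists, and the marker is then deleted so an unchanged re-run skips the reload. A minimal sketch of that pattern, with a stand-in callable instead of the real `nft` pipeline and an illustrative temp path:

```python
# Sketch of the change-marker pattern visible in the log: apply only when the
# marker file exists, then remove it so the next unchanged run is a no-op.
# `apply_cmd` stands in for feeding the concatenated .nft files to `nft -f -`.
import tempfile
from pathlib import Path

def apply_if_changed(marker: Path, apply_cmd) -> bool:
    if not marker.exists():
        return False        # rules unchanged since the last apply
    apply_cmd()             # e.g. reload flushes + rules + update-jumps
    marker.unlink()         # reset the marker after a successful apply
    return True

# usage: first run applies and clears the marker, second run is skipped
with tempfile.TemporaryDirectory() as d:
    marker = Path(d, "edpm-rules.nft.changed")
    marker.touch()
    first = apply_if_changed(marker, lambda: None)
    second = apply_if_changed(marker, lambda: None)
    print(first, second)    # True False
```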
Oct 02 07:51:08 compute-0 python3.9[89386]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:51:09 compute-0 sudo[89537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwtnjfeaeuudvzhialrvmjjvvrfexfdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391469.5818555-622-228078012089027/AnsiballZ_command.py'
Oct 02 07:51:09 compute-0 sudo[89537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:10 compute-0 python3.9[89539]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:51:10 compute-0 ovs-vsctl[89540]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct 02 07:51:10 compute-0 sudo[89537]: pam_unix(sudo:session): session closed for user root
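The `ovs-vsctl set open .` call above registers the node as an OVN chassis by writing `external_ids` keys (encap type/IP, bridge mappings, southbound `ovn-remote`, probe intervals) into the local Open_vSwitch row. As an illustration, a small helper that assembles that argument list from a mapping; the values below are copied from the log, but the helper itself is an assumption, not part of edpm-ansible.

```python
# Illustrative helper that builds the `ovs-vsctl set open .` argv from an
# external_ids mapping, mirroring the OVN chassis configuration recorded
# above. It only constructs the command; nothing is executed.
import shlex

def ovs_vsctl_set_external_ids(ids: dict) -> list:
    argv = ["ovs-vsctl", "set", "open", "."]
    argv += [f"external_ids:{key}={value}" for key, value in ids.items()]
    return argv

cmd = ovs_vsctl_set_external_ids({
    "ovn-bridge": "br-int",
    "ovn-bridge-mappings": "datacentre:br-ex",
    "ovn-encap-type": "geneve",
    "ovn-encap-ip": "172.19.0.100",
    "ovn-remote": "ssl:ovsdbserver-sb.openstack.svc:6642",
})
print(shlex.join(cmd))
```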
Oct 02 07:51:10 compute-0 sudo[89690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pftlqkinptcgewatlmvjfxhkqxbqyvwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391470.4081547-640-159511552881380/AnsiballZ_command.py'
Oct 02 07:51:10 compute-0 sudo[89690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:10 compute-0 python3.9[89692]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:51:10 compute-0 sudo[89690]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:11 compute-0 sudo[89845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdrfunmsswylcwzcfdnzyulxcepbzepl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391471.1275663-656-95904328066664/AnsiballZ_command.py'
Oct 02 07:51:11 compute-0 sudo[89845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:11 compute-0 python3.9[89847]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:51:11 compute-0 ovs-vsctl[89848]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct 02 07:51:11 compute-0 sudo[89845]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:12 compute-0 python3.9[89998]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:51:13 compute-0 sudo[90150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsneigqffjptusohpqtaxrungvzzcghd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391472.6943016-690-205157956734760/AnsiballZ_file.py'
Oct 02 07:51:13 compute-0 sudo[90150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:13 compute-0 python3.9[90152]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:13 compute-0 sudo[90150]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:13 compute-0 sudo[90302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdkscfevsczmwxqkmxwavdkfwklifzqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391473.4626553-706-219408527521286/AnsiballZ_stat.py'
Oct 02 07:51:13 compute-0 sudo[90302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:13 compute-0 python3.9[90304]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:51:13 compute-0 sudo[90302]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:14 compute-0 sudo[90380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opphtgidtgtfqhuewyffcfrjtjvcmfjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391473.4626553-706-219408527521286/AnsiballZ_file.py'
Oct 02 07:51:14 compute-0 sudo[90380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:14 compute-0 python3.9[90382]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:14 compute-0 sudo[90380]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:14 compute-0 sudo[90532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epcgjqsofrbjwumgvqexugfcaxmjpxhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391474.6635144-706-252416292734467/AnsiballZ_stat.py'
Oct 02 07:51:14 compute-0 sudo[90532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:15 compute-0 python3.9[90534]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:51:15 compute-0 sudo[90532]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:15 compute-0 sudo[90610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agwyhfvapymzrnoqinhngtcmcbmhaqjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391474.6635144-706-252416292734467/AnsiballZ_file.py'
Oct 02 07:51:15 compute-0 sudo[90610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:15 compute-0 python3.9[90612]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:15 compute-0 sudo[90610]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:16 compute-0 sudo[90762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgcpcpxgyckqaldpldfjxzphrimbwwkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391475.9859555-752-35026657704661/AnsiballZ_file.py'
Oct 02 07:51:16 compute-0 sudo[90762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:16 compute-0 python3.9[90764]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:16 compute-0 sudo[90762]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:17 compute-0 sudo[90914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvaviqcuvdxunkwefzheeyjkfmzhcwqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391476.7489245-768-130757720392614/AnsiballZ_stat.py'
Oct 02 07:51:17 compute-0 sudo[90914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:17 compute-0 python3.9[90916]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:51:17 compute-0 sudo[90914]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:17 compute-0 sudo[90992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgmzpllldhaufoxibkcmaonduzmlijhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391476.7489245-768-130757720392614/AnsiballZ_file.py'
Oct 02 07:51:17 compute-0 sudo[90992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:17 compute-0 python3.9[90994]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:17 compute-0 sudo[90992]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:18 compute-0 sudo[91144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uilecevbmvqofnpmhebhxhdaxrikoffq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391477.9937346-792-22762827601193/AnsiballZ_stat.py'
Oct 02 07:51:18 compute-0 sudo[91144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:18 compute-0 python3.9[91146]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:51:18 compute-0 sudo[91144]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:18 compute-0 sudo[91222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgysrlbvxtvtjkescefhjudbfglusskt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391477.9937346-792-22762827601193/AnsiballZ_file.py'
Oct 02 07:51:18 compute-0 sudo[91222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:19 compute-0 python3.9[91224]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:19 compute-0 sudo[91222]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:19 compute-0 sudo[91374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqfqtmpdwuewrtciekysuduarsnuktik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391479.2843544-816-116132141994316/AnsiballZ_systemd.py'
Oct 02 07:51:19 compute-0 sudo[91374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:19 compute-0 python3.9[91376]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:51:19 compute-0 systemd[1]: Reloading.
Oct 02 07:51:19 compute-0 systemd-rc-local-generator[91398]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:51:19 compute-0 systemd-sysv-generator[91402]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:51:20 compute-0 sudo[91374]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:20 compute-0 sudo[91563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lboyppnrbbwrejlxixkhulwzwjvjucdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391480.4693694-832-270150407806532/AnsiballZ_stat.py'
Oct 02 07:51:20 compute-0 sudo[91563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:21 compute-0 python3.9[91565]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:51:21 compute-0 sudo[91563]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:21 compute-0 sudo[91641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlpmqurvxvjgifhbsvcdqdjshwwupkvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391480.4693694-832-270150407806532/AnsiballZ_file.py'
Oct 02 07:51:21 compute-0 sudo[91641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:21 compute-0 python3.9[91643]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:21 compute-0 sudo[91641]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:22 compute-0 sudo[91793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlhrjazfdncuamafeezosyldhltnqfqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391481.675381-856-202192653438126/AnsiballZ_stat.py'
Oct 02 07:51:22 compute-0 sudo[91793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:22 compute-0 python3.9[91795]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:51:22 compute-0 sudo[91793]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:22 compute-0 sudo[91871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbamhecndnianwlxttjtxcjjjajzylpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391481.675381-856-202192653438126/AnsiballZ_file.py'
Oct 02 07:51:22 compute-0 sudo[91871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:22 compute-0 python3.9[91873]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:22 compute-0 sudo[91871]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:23 compute-0 sudo[92023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knfqxmwowjpjnmtderhjkjrfcaqjqoov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391482.9587257-880-53456478628732/AnsiballZ_systemd.py'
Oct 02 07:51:23 compute-0 sudo[92023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:23 compute-0 python3.9[92025]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:51:23 compute-0 systemd[1]: Reloading.
Oct 02 07:51:23 compute-0 systemd-rc-local-generator[92048]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:51:23 compute-0 systemd-sysv-generator[92051]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:51:23 compute-0 systemd[1]: Starting Create netns directory...
Oct 02 07:51:23 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 02 07:51:23 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 02 07:51:23 compute-0 systemd[1]: Finished Create netns directory.
Oct 02 07:51:24 compute-0 sudo[92023]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:25 compute-0 sudo[92217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyozdqlsypqiweezexjceviphhxnigfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391485.2727146-900-639185358167/AnsiballZ_file.py'
Oct 02 07:51:25 compute-0 sudo[92217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:25 compute-0 python3.9[92219]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:25 compute-0 sudo[92217]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:26 compute-0 sudo[92369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcoacmsobezqymsqvtdicewdczljuxsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391486.124674-916-178713233047302/AnsiballZ_stat.py'
Oct 02 07:51:26 compute-0 sudo[92369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:26 compute-0 python3.9[92371]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:51:26 compute-0 sudo[92369]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:27 compute-0 sudo[92492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaurtgrqqutqrlboncmspwnlpovnkorq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391486.124674-916-178713233047302/AnsiballZ_copy.py'
Oct 02 07:51:27 compute-0 sudo[92492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:27 compute-0 python3.9[92494]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391486.124674-916-178713233047302/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:27 compute-0 sudo[92492]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:28 compute-0 sudo[92644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcyfhhswpdrkkbnntmxpwxhzeinrxuor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391487.8702893-950-210137183550564/AnsiballZ_file.py'
Oct 02 07:51:28 compute-0 sudo[92644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:28 compute-0 python3.9[92646]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:28 compute-0 sudo[92644]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:29 compute-0 sudo[92796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpjrzjzcwnqmkuegiemeecfxgdsiqegw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391488.7281516-966-222505843182965/AnsiballZ_stat.py'
Oct 02 07:51:29 compute-0 sudo[92796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:29 compute-0 python3.9[92798]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:51:29 compute-0 sudo[92796]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:29 compute-0 sudo[92919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjmhcchpmwljvdizjtorpxptpijgshos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391488.7281516-966-222505843182965/AnsiballZ_copy.py'
Oct 02 07:51:29 compute-0 sudo[92919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:29 compute-0 python3.9[92921]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391488.7281516-966-222505843182965/.source.json _original_basename=.0dm_4zex follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:29 compute-0 sudo[92919]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:30 compute-0 sudo[93071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aylbyubfngmyfqfgjsqkcunvjycmvhxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391490.1484883-996-218945912209570/AnsiballZ_file.py'
Oct 02 07:51:30 compute-0 sudo[93071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:30 compute-0 python3.9[93073]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:30 compute-0 sudo[93071]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:31 compute-0 sudo[93223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wytezrxyaqluphhfyprwanvjrrqpfcbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391490.8978753-1012-122225702130341/AnsiballZ_stat.py'
Oct 02 07:51:31 compute-0 sudo[93223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:31 compute-0 sudo[93223]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:31 compute-0 sudo[93346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmpmdgnnscbbpqcjstlzeayzinphplpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391490.8978753-1012-122225702130341/AnsiballZ_copy.py'
Oct 02 07:51:31 compute-0 sudo[93346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:32 compute-0 sudo[93346]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:32 compute-0 sudo[93498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbhcnehjxqeerkuaifwkkdqbsdkgbafh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391492.3794231-1046-140704233800497/AnsiballZ_container_config_data.py'
Oct 02 07:51:32 compute-0 sudo[93498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:33 compute-0 python3.9[93500]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct 02 07:51:33 compute-0 sudo[93498]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:33 compute-0 sudo[93650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwflnxilswmhrafkgthqitjjxnttijnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391493.342203-1064-156358432068351/AnsiballZ_container_config_hash.py'
Oct 02 07:51:33 compute-0 sudo[93650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:34 compute-0 python3.9[93652]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 02 07:51:34 compute-0 sudo[93650]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:35 compute-0 sudo[93802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtjcpbdyezpeatesmcznqpdhfntxlpbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391494.4660144-1082-236472930280630/AnsiballZ_podman_container_info.py'
Oct 02 07:51:35 compute-0 sudo[93802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:35 compute-0 python3.9[93804]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 02 07:51:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:51:35 compute-0 sudo[93802]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:36 compute-0 sudo[93965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzkxjraniseucdtweqletnavlfafqzvs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759391495.8613138-1108-149427860178231/AnsiballZ_edpm_container_manage.py'
Oct 02 07:51:36 compute-0 sudo[93965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:36 compute-0 python3[93967]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 02 07:51:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:51:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:51:37 compute-0 podman[94003]: 2025-10-02 07:51:37.03120709 +0000 UTC m=+0.122796967 container create 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 02 07:51:37 compute-0 podman[94003]: 2025-10-02 07:51:36.945227906 +0000 UTC m=+0.036817833 image pull ceb6fcca0131acbc0ff37d5322c126e14f8045fca848e7440fedac2d6444d8c2 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct 02 07:51:37 compute-0 python3[93967]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct 02 07:51:37 compute-0 sudo[93965]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 02 07:51:37 compute-0 sudo[94191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrtqqkqupshulbyopcqjokrpkutwcvqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391497.411409-1124-267700522099007/AnsiballZ_stat.py'
Oct 02 07:51:37 compute-0 sudo[94191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:38 compute-0 python3.9[94193]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:51:38 compute-0 sudo[94191]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:38 compute-0 sudo[94345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffzhudixhdobeeugcdrdedjbmkceddvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391498.4043508-1142-92036572047992/AnsiballZ_file.py'
Oct 02 07:51:38 compute-0 sudo[94345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:38 compute-0 python3.9[94347]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:39 compute-0 sudo[94345]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:39 compute-0 sudo[94421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkkaljaibasrpxrzikzyedcpuepyrncy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391498.4043508-1142-92036572047992/AnsiballZ_stat.py'
Oct 02 07:51:39 compute-0 sudo[94421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:39 compute-0 python3.9[94423]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:51:39 compute-0 sudo[94421]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:40 compute-0 sudo[94572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eydgspgcaegeivzlwmclpwxbxjtnqttz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391499.5813575-1142-178809478621673/AnsiballZ_copy.py'
Oct 02 07:51:40 compute-0 sudo[94572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:40 compute-0 python3.9[94574]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759391499.5813575-1142-178809478621673/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:51:40 compute-0 sudo[94572]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:40 compute-0 sudo[94648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwmlrubecnucavnynjivhcfftgbazhbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391499.5813575-1142-178809478621673/AnsiballZ_systemd.py'
Oct 02 07:51:40 compute-0 sudo[94648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:40 compute-0 python3.9[94650]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 07:51:40 compute-0 systemd[1]: Reloading.
Oct 02 07:51:41 compute-0 systemd-rc-local-generator[94681]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:51:41 compute-0 systemd-sysv-generator[94685]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:51:41 compute-0 sudo[94648]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:41 compute-0 sudo[94762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shthpooagchorgmeechsmxuupgdbsdhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391499.5813575-1142-178809478621673/AnsiballZ_systemd.py'
Oct 02 07:51:41 compute-0 sudo[94762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:41 compute-0 python3.9[94764]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:51:41 compute-0 systemd[1]: Reloading.
Oct 02 07:51:41 compute-0 systemd-rc-local-generator[94796]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:51:41 compute-0 systemd-sysv-generator[94799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:51:42 compute-0 systemd[1]: Starting ovn_controller container...
Oct 02 07:51:42 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct 02 07:51:42 compute-0 systemd[1]: Started libcrun container.
Oct 02 07:51:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/438d5102d4125cfae2f3408f839b66a346397b3a93535e5f81be1d2790741e3c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 02 07:51:42 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6.
Oct 02 07:51:42 compute-0 podman[94806]: 2025-10-02 07:51:42.228569377 +0000 UTC m=+0.141625913 container init 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 02 07:51:42 compute-0 ovn_controller[94821]: + sudo -E kolla_set_configs
Oct 02 07:51:42 compute-0 podman[94806]: 2025-10-02 07:51:42.26418239 +0000 UTC m=+0.177238915 container start 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 02 07:51:42 compute-0 edpm-start-podman-container[94806]: ovn_controller
Oct 02 07:51:42 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct 02 07:51:42 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 02 07:51:42 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 02 07:51:42 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct 02 07:51:42 compute-0 systemd[94859]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 02 07:51:42 compute-0 edpm-start-podman-container[94805]: Creating additional drop-in dependency for "ovn_controller" (7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6)
Oct 02 07:51:42 compute-0 podman[94828]: 2025-10-02 07:51:42.364272342 +0000 UTC m=+0.080229414 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 07:51:42 compute-0 systemd[1]: 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6-66d431109454cfa4.service: Main process exited, code=exited, status=1/FAILURE
Oct 02 07:51:42 compute-0 systemd[1]: 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6-66d431109454cfa4.service: Failed with result 'exit-code'.
Oct 02 07:51:42 compute-0 systemd[1]: Reloading.
Oct 02 07:51:42 compute-0 systemd-sysv-generator[94912]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:51:42 compute-0 systemd-rc-local-generator[94902]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:51:42 compute-0 systemd[94859]: Queued start job for default target Main User Target.
Oct 02 07:51:42 compute-0 systemd[94859]: Created slice User Application Slice.
Oct 02 07:51:42 compute-0 systemd[94859]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 02 07:51:42 compute-0 systemd[94859]: Started Daily Cleanup of User's Temporary Directories.
Oct 02 07:51:42 compute-0 systemd[94859]: Reached target Paths.
Oct 02 07:51:42 compute-0 systemd[94859]: Reached target Timers.
Oct 02 07:51:42 compute-0 systemd[94859]: Starting D-Bus User Message Bus Socket...
Oct 02 07:51:42 compute-0 systemd[94859]: Starting Create User's Volatile Files and Directories...
Oct 02 07:51:42 compute-0 systemd[94859]: Finished Create User's Volatile Files and Directories.
Oct 02 07:51:42 compute-0 systemd[94859]: Listening on D-Bus User Message Bus Socket.
Oct 02 07:51:42 compute-0 systemd[94859]: Reached target Sockets.
Oct 02 07:51:42 compute-0 systemd[94859]: Reached target Basic System.
Oct 02 07:51:42 compute-0 systemd[94859]: Reached target Main User Target.
Oct 02 07:51:42 compute-0 systemd[94859]: Startup finished in 133ms.
Oct 02 07:51:42 compute-0 systemd[1]: Started User Manager for UID 0.
Oct 02 07:51:42 compute-0 systemd[1]: Started ovn_controller container.
Oct 02 07:51:42 compute-0 systemd[1]: Started Session c1 of User root.
Oct 02 07:51:42 compute-0 sudo[94762]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:42 compute-0 ovn_controller[94821]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 02 07:51:42 compute-0 ovn_controller[94821]: INFO:__main__:Validating config file
Oct 02 07:51:42 compute-0 ovn_controller[94821]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 02 07:51:42 compute-0 ovn_controller[94821]: INFO:__main__:Writing out command to execute
Oct 02 07:51:42 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Oct 02 07:51:42 compute-0 ovn_controller[94821]: ++ cat /run_command
Oct 02 07:51:42 compute-0 ovn_controller[94821]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 02 07:51:42 compute-0 ovn_controller[94821]: + ARGS=
Oct 02 07:51:42 compute-0 ovn_controller[94821]: + sudo kolla_copy_cacerts
Oct 02 07:51:42 compute-0 systemd[1]: Started Session c2 of User root.
Oct 02 07:51:42 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Oct 02 07:51:42 compute-0 ovn_controller[94821]: + [[ ! -n '' ]]
Oct 02 07:51:42 compute-0 ovn_controller[94821]: + . kolla_extend_start
Oct 02 07:51:42 compute-0 ovn_controller[94821]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 02 07:51:42 compute-0 ovn_controller[94821]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct 02 07:51:42 compute-0 ovn_controller[94821]: + umask 0022
Oct 02 07:51:42 compute-0 ovn_controller[94821]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct 02 07:51:42 compute-0 NetworkManager[51654]: <info>  [1759391502.8325] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct 02 07:51:42 compute-0 NetworkManager[51654]: <info>  [1759391502.8330] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 02 07:51:42 compute-0 NetworkManager[51654]: <info>  [1759391502.8338] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct 02 07:51:42 compute-0 NetworkManager[51654]: <info>  [1759391502.8343] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct 02 07:51:42 compute-0 NetworkManager[51654]: <info>  [1759391502.8345] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 02 07:51:42 compute-0 kernel: br-int: entered promiscuous mode
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00022|main|INFO|OVS feature set changed, force recompute.
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 02 07:51:42 compute-0 ovn_controller[94821]: 2025-10-02T07:51:42Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 02 07:51:42 compute-0 NetworkManager[51654]: <info>  [1759391502.8586] manager: (ovn-1d4a02-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct 02 07:51:42 compute-0 systemd-udevd[94956]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 07:51:42 compute-0 systemd-udevd[94963]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 07:51:42 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Oct 02 07:51:42 compute-0 NetworkManager[51654]: <info>  [1759391502.8750] device (genev_sys_6081): carrier: link connected
Oct 02 07:51:42 compute-0 NetworkManager[51654]: <info>  [1759391502.8754] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Oct 02 07:51:42 compute-0 NetworkManager[51654]: <info>  [1759391502.9186] manager: (ovn-61f597-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Oct 02 07:51:43 compute-0 sudo[95085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtaqaqndbakhrubnvwibpjdojyluzpff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391502.8961747-1198-99296487244981/AnsiballZ_command.py'
Oct 02 07:51:43 compute-0 sudo[95085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:43 compute-0 python3.9[95087]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:51:43 compute-0 ovs-vsctl[95088]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct 02 07:51:43 compute-0 sudo[95085]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:44 compute-0 sudo[95238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aktatlbaamffslbqbqwuaiktywlnasav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391503.7408276-1214-215831403898512/AnsiballZ_command.py'
Oct 02 07:51:44 compute-0 sudo[95238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:44 compute-0 python3.9[95240]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:51:44 compute-0 ovs-vsctl[95242]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct 02 07:51:44 compute-0 sudo[95238]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:45 compute-0 sudo[95393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cimtqopqneiavyqzikscojxrtvrbfces ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391504.731961-1242-168437729725112/AnsiballZ_command.py'
Oct 02 07:51:45 compute-0 sudo[95393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:45 compute-0 python3.9[95395]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:51:45 compute-0 ovs-vsctl[95396]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct 02 07:51:45 compute-0 sudo[95393]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:45 compute-0 sshd-session[84321]: Connection closed by 192.168.122.30 port 55516
Oct 02 07:51:45 compute-0 sshd-session[84318]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:51:45 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Oct 02 07:51:45 compute-0 systemd[1]: session-21.scope: Consumed 51.253s CPU time.
Oct 02 07:51:45 compute-0 systemd-logind[827]: Session 21 logged out. Waiting for processes to exit.
Oct 02 07:51:45 compute-0 systemd-logind[827]: Removed session 21.
Oct 02 07:51:51 compute-0 sshd-session[95421]: Accepted publickey for zuul from 192.168.122.30 port 48876 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:51:51 compute-0 systemd-logind[827]: New session 23 of user zuul.
Oct 02 07:51:51 compute-0 systemd[1]: Started Session 23 of User zuul.
Oct 02 07:51:51 compute-0 sshd-session[95421]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:51:52 compute-0 python3.9[95574]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:51:52 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct 02 07:51:52 compute-0 systemd[94859]: Activating special unit Exit the Session...
Oct 02 07:51:52 compute-0 systemd[94859]: Stopped target Main User Target.
Oct 02 07:51:52 compute-0 systemd[94859]: Stopped target Basic System.
Oct 02 07:51:52 compute-0 systemd[94859]: Stopped target Paths.
Oct 02 07:51:52 compute-0 systemd[94859]: Stopped target Sockets.
Oct 02 07:51:52 compute-0 systemd[94859]: Stopped target Timers.
Oct 02 07:51:52 compute-0 systemd[94859]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 02 07:51:52 compute-0 systemd[94859]: Closed D-Bus User Message Bus Socket.
Oct 02 07:51:52 compute-0 systemd[94859]: Stopped Create User's Volatile Files and Directories.
Oct 02 07:51:52 compute-0 systemd[94859]: Removed slice User Application Slice.
Oct 02 07:51:52 compute-0 systemd[94859]: Reached target Shutdown.
Oct 02 07:51:52 compute-0 systemd[94859]: Finished Exit the Session.
Oct 02 07:51:52 compute-0 systemd[94859]: Reached target Exit the Session.
Oct 02 07:51:52 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct 02 07:51:52 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct 02 07:51:52 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 02 07:51:52 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 02 07:51:52 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 02 07:51:52 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 02 07:51:52 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct 02 07:51:53 compute-0 sudo[95729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bilzyhzleeefpojfbdguuthjjovsmxnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391513.27693-48-48410655208199/AnsiballZ_file.py'
Oct 02 07:51:53 compute-0 sudo[95729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:54 compute-0 python3.9[95731]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:54 compute-0 sudo[95729]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:54 compute-0 sudo[95881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zihsdrybdydsycppfigvigxaufdyhuge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391514.3324382-48-251106216450323/AnsiballZ_file.py'
Oct 02 07:51:54 compute-0 sudo[95881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:54 compute-0 python3.9[95883]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:54 compute-0 sudo[95881]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:55 compute-0 sudo[96033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcnqjqbxfgifdpcujabpwuvrynnovffr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391515.0558085-48-153931976560737/AnsiballZ_file.py'
Oct 02 07:51:55 compute-0 sudo[96033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:55 compute-0 python3.9[96035]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:55 compute-0 sudo[96033]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:56 compute-0 sudo[96185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qopgzzcmwknexnnqgbgxvygufttoarop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391515.802839-48-230142876197142/AnsiballZ_file.py'
Oct 02 07:51:56 compute-0 sudo[96185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:56 compute-0 python3.9[96187]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:56 compute-0 sudo[96185]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:56 compute-0 sudo[96337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-narueiuvxcjuxsygldziqkfjsvbtsusd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391516.5616364-48-37629180184745/AnsiballZ_file.py'
Oct 02 07:51:56 compute-0 sudo[96337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:57 compute-0 python3.9[96339]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:51:57 compute-0 sudo[96337]: pam_unix(sudo:session): session closed for user root
Oct 02 07:51:58 compute-0 python3.9[96489]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:51:58 compute-0 sudo[96639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiaixpyayrfptfvsjuckhfgdxnidjbhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391518.399902-136-276934426356346/AnsiballZ_seboolean.py'
Oct 02 07:51:58 compute-0 sudo[96639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:51:59 compute-0 python3.9[96641]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 02 07:51:59 compute-0 sudo[96639]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:00 compute-0 python3.9[96791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:01 compute-0 python3.9[96913]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391519.9171486-152-15527800741045/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:02 compute-0 python3.9[97063]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:02 compute-0 python3.9[97184]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391521.61602-182-267497586883017/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:03 compute-0 sudo[97334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohcdwxsgclzjorenhrwyaulijqmwhgzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391523.269971-216-109116411174663/AnsiballZ_setup.py'
Oct 02 07:52:03 compute-0 sudo[97334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:03 compute-0 python3.9[97336]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:52:04 compute-0 sudo[97334]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:04 compute-0 sudo[97418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmcnyydhwolfdeybuahamijvjnedamrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391523.269971-216-109116411174663/AnsiballZ_dnf.py'
Oct 02 07:52:04 compute-0 sudo[97418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:05 compute-0 python3.9[97420]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:52:06 compute-0 sudo[97418]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:07 compute-0 sudo[97571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utjvempuucxdhynzstcngdrmzwlhjzcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391526.4915879-240-131310883505333/AnsiballZ_systemd.py'
Oct 02 07:52:07 compute-0 sudo[97571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:07 compute-0 python3.9[97573]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 02 07:52:07 compute-0 sudo[97571]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:08 compute-0 python3.9[97726]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:09 compute-0 python3.9[97847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391527.8754668-256-74799346359494/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:09 compute-0 python3.9[97997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:10 compute-0 python3.9[98118]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391529.283866-256-46167432282025/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:11 compute-0 python3.9[98268]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:12 compute-0 python3.9[98389]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391531.334897-344-149204078136368/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:12 compute-0 ovn_controller[94821]: 2025-10-02T07:52:12Z|00025|memory|INFO|16896 kB peak resident set size after 30.0 seconds
Oct 02 07:52:12 compute-0 ovn_controller[94821]: 2025-10-02T07:52:12Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Oct 02 07:52:12 compute-0 podman[98513]: 2025-10-02 07:52:12.84877046 +0000 UTC m=+0.110231511 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct 02 07:52:12 compute-0 python3.9[98549]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:13 compute-0 python3.9[98684]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391532.5116818-344-98300977076228/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:14 compute-0 python3.9[98834]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:52:14 compute-0 sudo[98986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jolqwmjgqyknkyvslfnuumbbligaunjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391534.5355012-420-276822489148236/AnsiballZ_file.py'
Oct 02 07:52:14 compute-0 sudo[98986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:15 compute-0 python3.9[98988]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:15 compute-0 sudo[98986]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:15 compute-0 sudo[99138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oetiksfgpvyizvtdtptgcwmtkkpbvicq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391535.372664-436-89065334021203/AnsiballZ_stat.py'
Oct 02 07:52:15 compute-0 sudo[99138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:15 compute-0 python3.9[99140]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:15 compute-0 sudo[99138]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:16 compute-0 sudo[99216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grrzvzxxqjefkfxkezvjfxcbpmoauqvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391535.372664-436-89065334021203/AnsiballZ_file.py'
Oct 02 07:52:16 compute-0 sudo[99216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:16 compute-0 python3.9[99218]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:16 compute-0 sudo[99216]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:17 compute-0 sudo[99368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkozxhugpyucileccptwuvlbvlazhmqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391536.655042-436-276301432318026/AnsiballZ_stat.py'
Oct 02 07:52:17 compute-0 sudo[99368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:17 compute-0 python3.9[99370]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:17 compute-0 sudo[99368]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:17 compute-0 sudo[99446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rujlataszrcjeypzfisqkoinaqrdjczb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391536.655042-436-276301432318026/AnsiballZ_file.py'
Oct 02 07:52:17 compute-0 sudo[99446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:17 compute-0 python3.9[99448]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:17 compute-0 sudo[99446]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:18 compute-0 sudo[99598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miolcpylyupveikeogkmxcskjyaovbll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391538.0593498-482-242853945992112/AnsiballZ_file.py'
Oct 02 07:52:18 compute-0 sudo[99598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:18 compute-0 python3.9[99600]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:52:18 compute-0 sudo[99598]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:19 compute-0 sudo[99750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sssbzwokepwvmvrbtahzzhyrugoabalh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391538.8245823-498-61351252472345/AnsiballZ_stat.py'
Oct 02 07:52:19 compute-0 sudo[99750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:19 compute-0 python3.9[99752]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:19 compute-0 sudo[99750]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:19 compute-0 sudo[99828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymyfwealsfyqnayiecsqwlqaonfzhqib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391538.8245823-498-61351252472345/AnsiballZ_file.py'
Oct 02 07:52:19 compute-0 sudo[99828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:19 compute-0 python3.9[99830]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:52:19 compute-0 sudo[99828]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:20 compute-0 sudo[99980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnhkqxmblueljmyzrasyfdzdotihggwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391540.1872737-522-100876369978388/AnsiballZ_stat.py'
Oct 02 07:52:20 compute-0 sudo[99980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:20 compute-0 python3.9[99982]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:20 compute-0 sudo[99980]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:21 compute-0 sudo[100058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rolyzesblnjphamagixuppfcnxtdrnua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391540.1872737-522-100876369978388/AnsiballZ_file.py'
Oct 02 07:52:21 compute-0 sudo[100058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:21 compute-0 python3.9[100060]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:52:21 compute-0 sudo[100058]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:21 compute-0 sudo[100210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqxartozzrnxmjkfagbetryjkvvnohxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391541.5678391-546-84170070915647/AnsiballZ_systemd.py'
Oct 02 07:52:21 compute-0 sudo[100210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:22 compute-0 python3.9[100212]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:52:22 compute-0 systemd[1]: Reloading.
Oct 02 07:52:22 compute-0 systemd-rc-local-generator[100231]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:52:22 compute-0 systemd-sysv-generator[100238]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:52:22 compute-0 sudo[100210]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:23 compute-0 sudo[100399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kefmldysfmxpfmdlyiiqcstceburrjxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391542.84713-562-206321796417758/AnsiballZ_stat.py'
Oct 02 07:52:23 compute-0 sudo[100399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:23 compute-0 python3.9[100401]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:23 compute-0 sudo[100399]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:23 compute-0 sudo[100477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kildtxkgpbtctqfxwpbfafvzeozedzgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391542.84713-562-206321796417758/AnsiballZ_file.py'
Oct 02 07:52:23 compute-0 sudo[100477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:23 compute-0 python3.9[100479]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:52:24 compute-0 sudo[100477]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:24 compute-0 sudo[100629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loibpsfzlomghninjyjzjhnfztqcdtlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391544.2154975-586-103879385708416/AnsiballZ_stat.py'
Oct 02 07:52:24 compute-0 sudo[100629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:24 compute-0 python3.9[100631]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:24 compute-0 sudo[100629]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:25 compute-0 sudo[100707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjqoyhtghheavttnblrxemzcpmdaifzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391544.2154975-586-103879385708416/AnsiballZ_file.py'
Oct 02 07:52:25 compute-0 sudo[100707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:25 compute-0 python3.9[100709]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:52:25 compute-0 sudo[100707]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:25 compute-0 sudo[100859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grgdjbkjvoxaztdueapasxyelvrrgrql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391545.4858673-610-270227858248521/AnsiballZ_systemd.py'
Oct 02 07:52:25 compute-0 sudo[100859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:26 compute-0 python3.9[100861]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:52:26 compute-0 systemd[1]: Reloading.
Oct 02 07:52:26 compute-0 systemd-rc-local-generator[100885]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:52:26 compute-0 systemd-sysv-generator[100890]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:52:26 compute-0 systemd[1]: Starting Create netns directory...
Oct 02 07:52:26 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 02 07:52:26 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 02 07:52:26 compute-0 systemd[1]: Finished Create netns directory.
Oct 02 07:52:26 compute-0 sudo[100859]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:27 compute-0 sudo[101052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loktifzdemivyxshjvpfxkbjsspnrkag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391546.8453906-630-21632429338721/AnsiballZ_file.py'
Oct 02 07:52:27 compute-0 sudo[101052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:27 compute-0 python3.9[101054]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:27 compute-0 sudo[101052]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:28 compute-0 sudo[101204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmotwxdfwnsxooccvttuqmpomojsiruj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391547.6656022-646-189518924700654/AnsiballZ_stat.py'
Oct 02 07:52:28 compute-0 sudo[101204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:28 compute-0 python3.9[101206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:28 compute-0 sudo[101204]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:28 compute-0 sudo[101327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygdpijvfyugszjphbuhkmxgjcydyvexk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391547.6656022-646-189518924700654/AnsiballZ_copy.py'
Oct 02 07:52:28 compute-0 sudo[101327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:28 compute-0 python3.9[101329]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391547.6656022-646-189518924700654/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:28 compute-0 sudo[101327]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:29 compute-0 sudo[101479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbuqtbbufivkxebqsuwkadfajbynexjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391549.4831533-680-170825812901269/AnsiballZ_file.py'
Oct 02 07:52:29 compute-0 sudo[101479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:30 compute-0 python3.9[101481]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:52:30 compute-0 sudo[101479]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:30 compute-0 sudo[101631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecoyvlufmazdrdzxmhdennvubcdmzynr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391550.3172069-696-192978779509540/AnsiballZ_stat.py'
Oct 02 07:52:30 compute-0 sudo[101631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:30 compute-0 python3.9[101633]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:52:30 compute-0 sudo[101631]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:31 compute-0 sudo[101754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bddneozzpjfzopumoudhifgvwnqtapxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391550.3172069-696-192978779509540/AnsiballZ_copy.py'
Oct 02 07:52:31 compute-0 sudo[101754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:31 compute-0 python3.9[101756]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391550.3172069-696-192978779509540/.source.json _original_basename=.s324ut8h follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:52:31 compute-0 sudo[101754]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:32 compute-0 sudo[101906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kallmaiqrlnsavhzwwgeqknibwoofuca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391551.7825208-726-111129241404181/AnsiballZ_file.py'
Oct 02 07:52:32 compute-0 sudo[101906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:32 compute-0 python3.9[101908]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:52:32 compute-0 sudo[101906]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:33 compute-0 sudo[102058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgzvjgplcnzftctijirlptetrgpmbijk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391552.6578605-742-82231296104458/AnsiballZ_stat.py'
Oct 02 07:52:33 compute-0 sudo[102058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:33 compute-0 sudo[102058]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:33 compute-0 sudo[102181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erfusbnyneukpzxskumkkzuojookkogh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391552.6578605-742-82231296104458/AnsiballZ_copy.py'
Oct 02 07:52:33 compute-0 sudo[102181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:33 compute-0 sudo[102181]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:34 compute-0 sudo[102333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lutnzglzzhwielmiqwszabyxkuktbdjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391554.237679-776-125619723501770/AnsiballZ_container_config_data.py'
Oct 02 07:52:34 compute-0 sudo[102333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:34 compute-0 python3.9[102335]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct 02 07:52:34 compute-0 sudo[102333]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:35 compute-0 sudo[102485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbtuhhfsygqcvadiudikylmgirwampuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391555.2023416-794-159635532232188/AnsiballZ_container_config_hash.py'
Oct 02 07:52:35 compute-0 sudo[102485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:35 compute-0 python3.9[102487]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 02 07:52:35 compute-0 sudo[102485]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:36 compute-0 sudo[102637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etgojzznotfouauovttbhrzkbapkiwuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391556.275713-812-57846047033808/AnsiballZ_podman_container_info.py'
Oct 02 07:52:36 compute-0 sudo[102637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:37 compute-0 python3.9[102639]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 02 07:52:37 compute-0 sudo[102637]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:38 compute-0 sudo[102815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqqgmarelensjghcibvmlgdppaypdzga ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759391557.8893242-838-98908899914161/AnsiballZ_edpm_container_manage.py'
Oct 02 07:52:38 compute-0 sudo[102815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:38 compute-0 python3[102817]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 02 07:52:38 compute-0 podman[102854]: 2025-10-02 07:52:38.945980583 +0000 UTC m=+0.051405744 container create 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 02 07:52:38 compute-0 podman[102854]: 2025-10-02 07:52:38.920258466 +0000 UTC m=+0.025683677 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 07:52:38 compute-0 python3[102817]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 07:52:39 compute-0 sudo[102815]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:39 compute-0 sudo[103042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylljswnpiskkdvlbsdkybxfbdjqfgbwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391559.3549044-854-216823305522129/AnsiballZ_stat.py'
Oct 02 07:52:39 compute-0 sudo[103042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:39 compute-0 python3.9[103044]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:52:40 compute-0 sudo[103042]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:40 compute-0 sudo[103196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyzwotmrmqedzsjnyfucjenzdpmaoeyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391560.2587636-872-85978907272504/AnsiballZ_file.py'
Oct 02 07:52:40 compute-0 sudo[103196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:40 compute-0 python3.9[103198]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:52:40 compute-0 sudo[103196]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:41 compute-0 sudo[103272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkfgidzzvqvcfntnqsxjxyhbjggwhtfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391560.2587636-872-85978907272504/AnsiballZ_stat.py'
Oct 02 07:52:41 compute-0 sudo[103272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:41 compute-0 python3.9[103274]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:52:41 compute-0 sudo[103272]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:41 compute-0 sudo[103423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klukgqmzgibpnvwezwirsvyqzdckphpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391561.298914-872-83019927001777/AnsiballZ_copy.py'
Oct 02 07:52:41 compute-0 sudo[103423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:41 compute-0 python3.9[103425]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759391561.298914-872-83019927001777/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:52:42 compute-0 sudo[103423]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:42 compute-0 sudo[103499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnbqhhlgzqngegizrldgzudwytxgisoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391561.298914-872-83019927001777/AnsiballZ_systemd.py'
Oct 02 07:52:42 compute-0 sudo[103499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:42 compute-0 python3.9[103501]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 07:52:42 compute-0 systemd[1]: Reloading.
Oct 02 07:52:42 compute-0 systemd-rc-local-generator[103527]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:52:42 compute-0 systemd-sysv-generator[103534]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:52:42 compute-0 sudo[103499]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:43 compute-0 podman[103538]: 2025-10-02 07:52:43.101409036 +0000 UTC m=+0.159119419 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 07:52:43 compute-0 sudo[103637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlejwlagprmgcszlzlpqhletuxjwcrnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391561.298914-872-83019927001777/AnsiballZ_systemd.py'
Oct 02 07:52:43 compute-0 sudo[103637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:43 compute-0 python3.9[103639]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:52:43 compute-0 systemd[1]: Reloading.
Oct 02 07:52:43 compute-0 systemd-sysv-generator[103668]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:52:43 compute-0 systemd-rc-local-generator[103665]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:52:43 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Oct 02 07:52:44 compute-0 systemd[1]: Started libcrun container.
Oct 02 07:52:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c19ab42c9dfd0078f69af5c3fb7ea3bfab476f4becf5e8d74ce5e2c5f9825a5a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 02 07:52:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c19ab42c9dfd0078f69af5c3fb7ea3bfab476f4becf5e8d74ce5e2c5f9825a5a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 07:52:44 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa.
Oct 02 07:52:44 compute-0 podman[103681]: 2025-10-02 07:52:44.127910933 +0000 UTC m=+0.165210301 container init 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: + sudo -E kolla_set_configs
Oct 02 07:52:44 compute-0 podman[103681]: 2025-10-02 07:52:44.162396769 +0000 UTC m=+0.199696057 container start 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 02 07:52:44 compute-0 edpm-start-podman-container[103681]: ovn_metadata_agent
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Validating config file
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Copying service configuration files
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Writing out command to execute
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: ++ cat /run_command
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: + CMD=neutron-ovn-metadata-agent
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: + ARGS=
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: + sudo kolla_copy_cacerts
Oct 02 07:52:44 compute-0 edpm-start-podman-container[103680]: Creating additional drop-in dependency for "ovn_metadata_agent" (66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa)
Oct 02 07:52:44 compute-0 podman[103705]: 2025-10-02 07:52:44.270348702 +0000 UTC m=+0.087586104 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: + [[ ! -n '' ]]
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: + . kolla_extend_start
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: Running command: 'neutron-ovn-metadata-agent'
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: + umask 0022
Oct 02 07:52:44 compute-0 ovn_metadata_agent[103698]: + exec neutron-ovn-metadata-agent
Oct 02 07:52:44 compute-0 systemd[1]: Reloading.
Oct 02 07:52:44 compute-0 systemd-rc-local-generator[103768]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:52:44 compute-0 systemd-sysv-generator[103771]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:52:44 compute-0 systemd[1]: Started ovn_metadata_agent container.
Oct 02 07:52:44 compute-0 sudo[103637]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:44 compute-0 sshd-session[95424]: Connection closed by 192.168.122.30 port 48876
Oct 02 07:52:44 compute-0 sshd-session[95421]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:52:44 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Oct 02 07:52:44 compute-0 systemd[1]: session-23.scope: Consumed 40.267s CPU time.
Oct 02 07:52:44 compute-0 systemd-logind[827]: Session 23 logged out. Waiting for processes to exit.
Oct 02 07:52:44 compute-0 systemd-logind[827]: Removed session 23.
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.903 103703 INFO neutron.common.config [-] Logging enabled!
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.903 103703 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.903 103703 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.904 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.904 103703 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.904 103703 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.904 103703 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.904 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.904 103703 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.905 103703 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.905 103703 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.905 103703 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.905 103703 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.905 103703 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.905 103703 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.905 103703 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.905 103703 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.906 103703 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.906 103703 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.906 103703 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.906 103703 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.906 103703 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.906 103703 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.906 103703 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.906 103703 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.906 103703 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.907 103703 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.907 103703 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.907 103703 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.907 103703 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.907 103703 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.907 103703 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.907 103703 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.907 103703 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.907 103703 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.907 103703 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.908 103703 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.908 103703 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.908 103703 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.908 103703 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.908 103703 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.908 103703 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.908 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.908 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.908 103703 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.909 103703 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.909 103703 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.909 103703 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.909 103703 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.909 103703 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.909 103703 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.909 103703 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.909 103703 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.909 103703 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.909 103703 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.910 103703 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.910 103703 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.910 103703 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.910 103703 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.910 103703 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.910 103703 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.910 103703 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.911 103703 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.911 103703 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.911 103703 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.911 103703 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.911 103703 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.911 103703 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.911 103703 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.911 103703 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.911 103703 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.912 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.912 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.912 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.912 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.912 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.912 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.912 103703 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.912 103703 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.912 103703 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.913 103703 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.913 103703 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.913 103703 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.913 103703 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.913 103703 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.913 103703 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.913 103703 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.913 103703 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.913 103703 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.913 103703 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.914 103703 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.914 103703 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.914 103703 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.914 103703 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.914 103703 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.914 103703 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.914 103703 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.914 103703 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.914 103703 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.914 103703 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.915 103703 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.915 103703 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.915 103703 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.915 103703 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.915 103703 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.915 103703 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.915 103703 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.916 103703 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.916 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.916 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.916 103703 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.916 103703 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.916 103703 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.916 103703 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.916 103703 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.917 103703 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.917 103703 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.917 103703 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.917 103703 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.917 103703 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.917 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.917 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.917 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.918 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.918 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.918 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.918 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.918 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.918 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.918 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.918 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.918 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.919 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.919 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.919 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.919 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.919 103703 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.919 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.919 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.919 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.919 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.919 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.920 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.920 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.920 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.920 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.920 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.920 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.920 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.920 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.920 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.921 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.921 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.921 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.921 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.921 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.921 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.921 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.921 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.921 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.921 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.922 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.922 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.922 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.922 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.922 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.922 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.922 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.922 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.922 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.923 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.923 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.923 103703 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.923 103703 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.923 103703 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.923 103703 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.923 103703 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.923 103703 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.923 103703 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.923 103703 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.924 103703 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.924 103703 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.924 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.924 103703 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.924 103703 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.924 103703 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.924 103703 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.924 103703 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.925 103703 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.925 103703 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.925 103703 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.925 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.925 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.925 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.925 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.925 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.926 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.926 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.926 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.926 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.926 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.926 103703 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.926 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.926 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.926 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.926 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.927 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.927 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.927 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.927 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.927 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.927 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.927 103703 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.927 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.927 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.928 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.928 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.928 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.928 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.928 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.928 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.928 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.928 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.928 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.928 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.929 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.929 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.929 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.929 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.929 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.929 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.929 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.929 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.929 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.929 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.930 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.930 103703 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.930 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.930 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.930 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.930 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.930 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.930 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.930 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.931 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.931 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.931 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.931 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.931 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.931 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.931 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.931 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.932 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.932 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.932 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.932 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.932 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.932 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.932 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.932 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.932 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.933 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.933 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.933 103703 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.933 103703 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.933 103703 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.933 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.933 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.933 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.933 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.934 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.934 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.934 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.934 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.934 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.934 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.934 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.934 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.935 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.935 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.935 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.935 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.935 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.935 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.935 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.935 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.935 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.936 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.936 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.936 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.936 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.936 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.936 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.936 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.936 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.936 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.937 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.937 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.937 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.937 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.937 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.937 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.937 103703 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.937 103703 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.947 103703 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.947 103703 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.947 103703 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.947 103703 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.948 103703 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.959 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 66c4bca3-98aa-4361-8801-8722dd9a7888 (UUID: 66c4bca3-98aa-4361-8801-8722dd9a7888) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.988 103703 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.988 103703 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.988 103703 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.989 103703 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 02 07:52:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:45.992 103703 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.000 103703 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.006 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '66c4bca3-98aa-4361-8801-8722dd9a7888'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], external_ids={}, name=66c4bca3-98aa-4361-8801-8722dd9a7888, nb_cfg_timestamp=1759391510854, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.007 103703 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fd4223dfa00>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.008 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.008 103703 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.008 103703 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.008 103703 INFO oslo_service.service [-] Starting 1 workers
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.012 103703 DEBUG oslo_service.service [-] Started child 103809 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.015 103703 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpfle_quq_/privsep.sock']
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.016 103809 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-361910'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.047 103809 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.047 103809 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.047 103809 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.051 103809 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.062 103809 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.074 103809 INFO eventlet.wsgi.server [-] (103809) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Oct 02 07:52:46 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.729 103703 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.731 103703 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpfle_quq_/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.588 103814 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.593 103814 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.595 103814 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.595 103814 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103814
Oct 02 07:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:46.735 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[141999b8-d8ab-447a-8262-2f345c7fc02a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.226 103814 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.226 103814 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.226 103814 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.729 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[3f010610-cbd1-40fd-8185-ce0c4ec26079]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.732 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, column=external_ids, values=({'neutron:ovn-metadata-id': '18813f7d-b706-5db6-a247-2c242c630f5e'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.741 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.748 103703 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.748 103703 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.748 103703 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.748 103703 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.749 103703 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.749 103703 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.749 103703 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.749 103703 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.750 103703 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.750 103703 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.750 103703 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.750 103703 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.751 103703 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.751 103703 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.751 103703 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.752 103703 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.752 103703 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.752 103703 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.752 103703 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.753 103703 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.753 103703 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.753 103703 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.753 103703 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.754 103703 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.754 103703 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.754 103703 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.755 103703 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.755 103703 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.755 103703 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.755 103703 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.756 103703 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.756 103703 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.756 103703 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.756 103703 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.757 103703 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.757 103703 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.757 103703 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.758 103703 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.758 103703 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.758 103703 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.758 103703 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.759 103703 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.759 103703 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.759 103703 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.759 103703 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.760 103703 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.760 103703 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.760 103703 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.760 103703 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.761 103703 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.761 103703 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.761 103703 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.761 103703 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.761 103703 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.762 103703 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.762 103703 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.762 103703 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.763 103703 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.763 103703 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.763 103703 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.763 103703 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.763 103703 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.764 103703 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.764 103703 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.764 103703 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.765 103703 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.765 103703 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.765 103703 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.765 103703 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.765 103703 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.766 103703 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.766 103703 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.766 103703 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.767 103703 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.767 103703 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.767 103703 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.767 103703 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.768 103703 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.768 103703 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.768 103703 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.769 103703 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.769 103703 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.769 103703 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.769 103703 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.770 103703 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.770 103703 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.770 103703 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.771 103703 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.771 103703 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.771 103703 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.771 103703 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.772 103703 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.772 103703 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.772 103703 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.772 103703 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.773 103703 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.773 103703 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.773 103703 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.773 103703 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.773 103703 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.774 103703 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.774 103703 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.774 103703 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.774 103703 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.775 103703 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.775 103703 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.775 103703 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.775 103703 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.776 103703 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.776 103703 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.776 103703 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.777 103703 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.777 103703 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.777 103703 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.777 103703 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.778 103703 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.778 103703 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.778 103703 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.778 103703 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.779 103703 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.779 103703 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.779 103703 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.780 103703 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.780 103703 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.780 103703 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.780 103703 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.781 103703 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.781 103703 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.781 103703 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.781 103703 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.782 103703 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.782 103703 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.782 103703 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.782 103703 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.782 103703 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.783 103703 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.783 103703 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.783 103703 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.784 103703 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.784 103703 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.784 103703 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.784 103703 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.784 103703 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.785 103703 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.785 103703 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.785 103703 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.785 103703 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.786 103703 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.786 103703 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.786 103703 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.786 103703 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.787 103703 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.787 103703 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.787 103703 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.787 103703 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.788 103703 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.788 103703 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.788 103703 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.788 103703 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.788 103703 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.788 103703 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.789 103703 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.789 103703 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.789 103703 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.789 103703 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.790 103703 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.790 103703 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.790 103703 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.790 103703 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.790 103703 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.791 103703 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.791 103703 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.791 103703 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.791 103703 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.792 103703 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.792 103703 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.792 103703 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.792 103703 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.792 103703 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.792 103703 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.793 103703 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.793 103703 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.793 103703 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.793 103703 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.793 103703 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.793 103703 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.794 103703 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.794 103703 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.794 103703 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.794 103703 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.794 103703 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.794 103703 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.794 103703 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.795 103703 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.795 103703 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.795 103703 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.795 103703 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.795 103703 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.795 103703 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.795 103703 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.796 103703 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.796 103703 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.796 103703 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.796 103703 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.796 103703 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.796 103703 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.796 103703 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.797 103703 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.797 103703 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.797 103703 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.797 103703 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.797 103703 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.797 103703 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.797 103703 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.797 103703 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.798 103703 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.798 103703 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.798 103703 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.798 103703 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.798 103703 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.798 103703 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.798 103703 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.798 103703 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.799 103703 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.799 103703 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.799 103703 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.799 103703 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.799 103703 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.799 103703 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.799 103703 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.800 103703 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.800 103703 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.800 103703 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.800 103703 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.800 103703 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.800 103703 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.800 103703 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.800 103703 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.801 103703 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.801 103703 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.801 103703 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.801 103703 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.801 103703 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.801 103703 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.801 103703 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.802 103703 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.802 103703 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.802 103703 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.802 103703 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.802 103703 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.802 103703 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.802 103703 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.802 103703 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.803 103703 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.803 103703 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.803 103703 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.803 103703 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.803 103703 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.803 103703 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.803 103703 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.804 103703 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.804 103703 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.804 103703 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.804 103703 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.804 103703 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.804 103703 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.804 103703 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.804 103703 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.805 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.805 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.805 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.805 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.805 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.805 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.806 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.806 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.806 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.806 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.806 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.806 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.806 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.806 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.807 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.807 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.807 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.807 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.807 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.807 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.807 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.808 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.808 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.808 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.808 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.808 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.808 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.808 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.809 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.809 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.809 103703 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.809 103703 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.809 103703 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.809 103703 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.810 103703 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 07:52:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:52:47.810 103703 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 02 07:52:50 compute-0 sshd-session[103819]: Accepted publickey for zuul from 192.168.122.30 port 52282 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:52:50 compute-0 systemd-logind[827]: New session 24 of user zuul.
Oct 02 07:52:50 compute-0 systemd[1]: Started Session 24 of User zuul.
Oct 02 07:52:50 compute-0 sshd-session[103819]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:52:51 compute-0 python3.9[103972]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:52:52 compute-0 sudo[104126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehofvrzoycgaaxozghqghrryttdzaalq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391571.89797-48-211677248552136/AnsiballZ_command.py'
Oct 02 07:52:52 compute-0 sudo[104126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:52 compute-0 python3.9[104128]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:52:52 compute-0 sudo[104126]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:53 compute-0 sudo[104291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sguhrirzlhaopbmohfnbkhpuiccksewa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391573.1603048-70-207987527037306/AnsiballZ_systemd_service.py'
Oct 02 07:52:53 compute-0 sudo[104291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:52:54 compute-0 python3.9[104293]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 07:52:54 compute-0 systemd[1]: Reloading.
Oct 02 07:52:54 compute-0 systemd-rc-local-generator[104316]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:52:54 compute-0 systemd-sysv-generator[104319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:52:54 compute-0 sudo[104291]: pam_unix(sudo:session): session closed for user root
Oct 02 07:52:55 compute-0 python3.9[104477]: ansible-ansible.builtin.service_facts Invoked
Oct 02 07:52:55 compute-0 network[104494]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 02 07:52:55 compute-0 network[104495]: 'network-scripts' will be removed from distribution in near future.
Oct 02 07:52:55 compute-0 network[104496]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 02 07:52:59 compute-0 sudo[104758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzpijrmtaludyqdxtqhtucdrjpmvmbmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391579.5520194-108-230493957811548/AnsiballZ_systemd_service.py'
Oct 02 07:52:59 compute-0 sudo[104758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:00 compute-0 python3.9[104760]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:53:00 compute-0 sudo[104758]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:00 compute-0 sudo[104911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugcmvvlalkfglkwecaulwjnvwrnhlibc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391580.4986918-108-65679708759121/AnsiballZ_systemd_service.py'
Oct 02 07:53:00 compute-0 sudo[104911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:01 compute-0 python3.9[104913]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:53:01 compute-0 sudo[104911]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:01 compute-0 sudo[105064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-albvafdsbguuciyghnliysuyquticbqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391581.3181138-108-68232671550120/AnsiballZ_systemd_service.py'
Oct 02 07:53:01 compute-0 sudo[105064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:01 compute-0 python3.9[105066]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:53:02 compute-0 sudo[105064]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:02 compute-0 sudo[105217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brurvmrldugxbibeyvtkyogihtpijvef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391582.15887-108-230438122852699/AnsiballZ_systemd_service.py'
Oct 02 07:53:02 compute-0 sudo[105217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:02 compute-0 python3.9[105219]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:53:02 compute-0 sudo[105217]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:03 compute-0 sudo[105370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scnanncxuvgxmqjlwfigwkeoirkcvkom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391582.960714-108-220270153303636/AnsiballZ_systemd_service.py'
Oct 02 07:53:03 compute-0 sudo[105370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:03 compute-0 python3.9[105372]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:53:03 compute-0 sudo[105370]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:04 compute-0 sudo[105523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arnxdnaocjqyrbrsszcymdjxhhidgzpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391583.8057263-108-265904057366817/AnsiballZ_systemd_service.py'
Oct 02 07:53:04 compute-0 sudo[105523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:04 compute-0 python3.9[105525]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:53:04 compute-0 sudo[105523]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:04 compute-0 sudo[105676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqbqvrgmmqsoyaoonlazdxknhuziljbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391584.6202788-108-142975481359607/AnsiballZ_systemd_service.py'
Oct 02 07:53:04 compute-0 sudo[105676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:05 compute-0 python3.9[105678]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:53:05 compute-0 sudo[105676]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:06 compute-0 sudo[105829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqzbwbtfeydrkvjgkuestxfkstjvjrtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391585.8645446-212-135856145671531/AnsiballZ_file.py'
Oct 02 07:53:06 compute-0 sudo[105829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:06 compute-0 python3.9[105831]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:06 compute-0 sudo[105829]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:07 compute-0 sudo[105981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mobcnkzspmjwispszdnfmjzvqithfbic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391586.71246-212-259758170155056/AnsiballZ_file.py'
Oct 02 07:53:07 compute-0 sudo[105981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:07 compute-0 python3.9[105983]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:07 compute-0 sudo[105981]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:07 compute-0 sudo[106133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yymoudwttnioqncevecudqrtzlovnwww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391587.4998138-212-204883003095418/AnsiballZ_file.py'
Oct 02 07:53:07 compute-0 sudo[106133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:08 compute-0 python3.9[106135]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:08 compute-0 sudo[106133]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:08 compute-0 sudo[106285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgeuerhpsbmoybobyvcydyrcptnsrqlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391588.231197-212-225280293905432/AnsiballZ_file.py'
Oct 02 07:53:08 compute-0 sudo[106285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:08 compute-0 python3.9[106287]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:08 compute-0 sudo[106285]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:09 compute-0 sudo[106437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujwckgbirvmhzulqfqmxighgjtvuekbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391588.919164-212-263542510490289/AnsiballZ_file.py'
Oct 02 07:53:09 compute-0 sudo[106437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:09 compute-0 python3.9[106439]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:09 compute-0 sudo[106437]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:10 compute-0 sudo[106589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbtnpoubyhkvjstdvzkpkgnujwbtxcyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391589.695196-212-135501061340895/AnsiballZ_file.py'
Oct 02 07:53:10 compute-0 sudo[106589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:10 compute-0 python3.9[106591]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:10 compute-0 sudo[106589]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:10 compute-0 sudo[106741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdlfsdyxvpragtgmndtpgkrxftjmlkuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391590.4097333-212-228607742337994/AnsiballZ_file.py'
Oct 02 07:53:10 compute-0 sudo[106741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:10 compute-0 python3.9[106743]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:10 compute-0 sudo[106741]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:11 compute-0 sudo[106893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsptnqoctihhhugeqzzgqdvoikcqrbbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391591.151362-312-6700272916271/AnsiballZ_file.py'
Oct 02 07:53:11 compute-0 sudo[106893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:11 compute-0 python3.9[106895]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:11 compute-0 sudo[106893]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:12 compute-0 sudo[107045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sovqurfvrenzmvtqlcaubtxslxgwwequ ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391591.8027928-312-273892244204766/AnsiballZ_file.py'
Oct 02 07:53:12 compute-0 sudo[107045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:12 compute-0 python3.9[107047]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:12 compute-0 sudo[107045]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:12 compute-0 sudo[107197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frlvpupikgdxislbqnkhhburtvarrbwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391592.4282005-312-267326788620205/AnsiballZ_file.py'
Oct 02 07:53:12 compute-0 sudo[107197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:12 compute-0 python3.9[107199]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:12 compute-0 sudo[107197]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:13 compute-0 sudo[107359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgwczdnepjtezfnmtikcrkgsubzdrhjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391593.134871-312-204046689325882/AnsiballZ_file.py'
Oct 02 07:53:13 compute-0 sudo[107359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:13 compute-0 podman[107323]: 2025-10-02 07:53:13.526663538 +0000 UTC m=+0.133146885 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true)
Oct 02 07:53:13 compute-0 python3.9[107361]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:13 compute-0 sudo[107359]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:14 compute-0 sudo[107525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfjmsoklhjcyejbviocrqnozpdhrokqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391593.836736-312-182262861327759/AnsiballZ_file.py'
Oct 02 07:53:14 compute-0 sudo[107525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:14 compute-0 python3.9[107527]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:14 compute-0 sudo[107525]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:14 compute-0 sudo[107689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elffaespjnntdtoelzahbaciphbaemee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391594.4655616-312-142755870877401/AnsiballZ_file.py'
Oct 02 07:53:14 compute-0 sudo[107689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:14 compute-0 podman[107651]: 2025-10-02 07:53:14.8695235 +0000 UTC m=+0.083441638 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 07:53:15 compute-0 python3.9[107699]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:15 compute-0 sudo[107689]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:15 compute-0 sudo[107849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfkmpunpjkyxdiechvevjesmrizbwxbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391595.2837632-312-221962449326555/AnsiballZ_file.py'
Oct 02 07:53:15 compute-0 sudo[107849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:15 compute-0 python3.9[107851]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:53:15 compute-0 sudo[107849]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:16 compute-0 sudo[108001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fybzuchdneapmmuwrexzpohmjktktfqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391596.2768903-414-219654155191485/AnsiballZ_command.py'
Oct 02 07:53:16 compute-0 sudo[108001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:16 compute-0 python3.9[108003]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:53:16 compute-0 sudo[108001]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:17 compute-0 python3.9[108155]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 02 07:53:18 compute-0 sudo[108305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqjwldtbksqzumyqrvztfviklawfitcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391598.072192-450-153828448053830/AnsiballZ_systemd_service.py'
Oct 02 07:53:18 compute-0 sudo[108305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:18 compute-0 python3.9[108307]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 07:53:18 compute-0 systemd[1]: Reloading.
Oct 02 07:53:18 compute-0 systemd-rc-local-generator[108335]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:53:18 compute-0 systemd-sysv-generator[108338]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:53:19 compute-0 sudo[108305]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:19 compute-0 sudo[108492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufpewosfodtyiaaezgyfsaykxljzfpup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391599.2225358-466-50101268867359/AnsiballZ_command.py'
Oct 02 07:53:19 compute-0 sudo[108492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:19 compute-0 python3.9[108494]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:53:19 compute-0 sudo[108492]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:20 compute-0 sudo[108645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbvavjfouofgjergqmmerlivdsgdskhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391599.9839606-466-167875742231372/AnsiballZ_command.py'
Oct 02 07:53:20 compute-0 sudo[108645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:20 compute-0 python3.9[108647]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:53:20 compute-0 sudo[108645]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:21 compute-0 sudo[108798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnxwsvjqtmmhmyrsxhxyxxsmwriipeoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391600.7392044-466-217709149613986/AnsiballZ_command.py'
Oct 02 07:53:21 compute-0 sudo[108798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:21 compute-0 python3.9[108800]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:53:21 compute-0 sudo[108798]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:21 compute-0 sudo[108951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmjzombovpibshcvgrjpixtxckbognxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391601.5539808-466-130459497815773/AnsiballZ_command.py'
Oct 02 07:53:21 compute-0 sudo[108951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:22 compute-0 python3.9[108953]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:53:22 compute-0 sudo[108951]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:22 compute-0 sudo[109104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyxviazddsaftysvsdaxgogggftmrdqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391602.2074347-466-164036911457516/AnsiballZ_command.py'
Oct 02 07:53:22 compute-0 sudo[109104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:22 compute-0 python3.9[109106]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:53:22 compute-0 sudo[109104]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:23 compute-0 sudo[109257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqxhnlgqdkrmbzqeydvfnwvtnzkwpqiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391602.8381908-466-268950894845317/AnsiballZ_command.py'
Oct 02 07:53:23 compute-0 sudo[109257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:23 compute-0 python3.9[109259]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:53:23 compute-0 sudo[109257]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:23 compute-0 sudo[109410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqurgbiwmqawxsmmfakhnpwhnovusaih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391603.5688174-466-240886219243959/AnsiballZ_command.py'
Oct 02 07:53:23 compute-0 sudo[109410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:24 compute-0 python3.9[109412]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:53:24 compute-0 sudo[109410]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:25 compute-0 sudo[109563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opxrhoguafzxuxhkbmcaxpotfobhnzbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391604.6816013-574-164886921973737/AnsiballZ_getent.py'
Oct 02 07:53:25 compute-0 sudo[109563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:25 compute-0 python3.9[109565]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct 02 07:53:25 compute-0 sudo[109563]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:26 compute-0 sudo[109716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuchsjffuyoxjoaocxbsfdmxbvfzhjrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391605.7480223-590-2548216288483/AnsiballZ_group.py'
Oct 02 07:53:26 compute-0 sudo[109716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:26 compute-0 python3.9[109718]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 02 07:53:26 compute-0 groupadd[109719]: group added to /etc/group: name=libvirt, GID=42473
Oct 02 07:53:26 compute-0 groupadd[109719]: group added to /etc/gshadow: name=libvirt
Oct 02 07:53:26 compute-0 groupadd[109719]: new group: name=libvirt, GID=42473
Oct 02 07:53:26 compute-0 sudo[109716]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:27 compute-0 sudo[109874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhynymxpxvxxpouraylahvbwwujycmsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391606.8397105-606-242434844392429/AnsiballZ_user.py'
Oct 02 07:53:27 compute-0 sudo[109874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:27 compute-0 python3.9[109876]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 02 07:53:27 compute-0 useradd[109878]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Oct 02 07:53:27 compute-0 sudo[109874]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:28 compute-0 sudo[110034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waxqzgefxsfdwshmedavclvziayvmevz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391608.169791-628-74182123437442/AnsiballZ_setup.py'
Oct 02 07:53:28 compute-0 sudo[110034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:28 compute-0 python3.9[110036]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:53:29 compute-0 sudo[110034]: pam_unix(sudo:session): session closed for user root
Oct 02 07:53:29 compute-0 sudo[110118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqestdmavjkpmxjcvelwdekhtoccgsnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391608.169791-628-74182123437442/AnsiballZ_dnf.py'
Oct 02 07:53:29 compute-0 sudo[110118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:53:29 compute-0 python3.9[110120]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:53:44 compute-0 podman[110305]: 2025-10-02 07:53:44.222496668 +0000 UTC m=+0.122960183 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 02 07:53:45 compute-0 podman[110331]: 2025-10-02 07:53:45.170902915 +0000 UTC m=+0.080324596 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 07:53:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:53:45.949 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 07:53:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:53:45.950 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 07:53:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:53:45.950 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 07:53:57 compute-0 kernel: SELinux:  Converting 2752 SID table entries...
Oct 02 07:53:57 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:53:57 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 02 07:53:57 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:53:57 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:53:57 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:53:57 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:53:57 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:54:06 compute-0 kernel: SELinux:  Converting 2752 SID table entries...
Oct 02 07:54:07 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:54:07 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 02 07:54:07 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:54:07 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:54:07 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:54:07 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:54:07 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:54:15 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct 02 07:54:15 compute-0 podman[110371]: 2025-10-02 07:54:15.211124945 +0000 UTC m=+0.120075889 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller)
Oct 02 07:54:15 compute-0 podman[110395]: 2025-10-02 07:54:15.330596474 +0000 UTC m=+0.080697538 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 02 07:54:33 compute-0 sshd-session[117647]: Invalid user admin from 2.55.100.104 port 58480
Oct 02 07:54:33 compute-0 sshd-session[117647]: Connection closed by invalid user admin 2.55.100.104 port 58480 [preauth]
Oct 02 07:54:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:54:45.951 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 07:54:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:54:45.951 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 07:54:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:54:45.952 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 07:54:46 compute-0 podman[125315]: 2025-10-02 07:54:46.165781602 +0000 UTC m=+0.066029493 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 02 07:54:46 compute-0 podman[125332]: 2025-10-02 07:54:46.17808332 +0000 UTC m=+0.084446793 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Oct 02 07:55:02 compute-0 kernel: SELinux:  Converting 2753 SID table entries...
Oct 02 07:55:02 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 02 07:55:02 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 02 07:55:02 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 02 07:55:02 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 02 07:55:02 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 02 07:55:02 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 02 07:55:02 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 02 07:55:03 compute-0 groupadd[127223]: group added to /etc/group: name=dnsmasq, GID=992
Oct 02 07:55:03 compute-0 groupadd[127223]: group added to /etc/gshadow: name=dnsmasq
Oct 02 07:55:03 compute-0 groupadd[127223]: new group: name=dnsmasq, GID=992
Oct 02 07:55:03 compute-0 useradd[127230]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Oct 02 07:55:03 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Oct 02 07:55:03 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct 02 07:55:03 compute-0 dbus-broker-launch[793]: Noticed file-system modification, trigger reload.
Oct 02 07:55:04 compute-0 groupadd[127243]: group added to /etc/group: name=clevis, GID=991
Oct 02 07:55:04 compute-0 groupadd[127243]: group added to /etc/gshadow: name=clevis
Oct 02 07:55:04 compute-0 groupadd[127243]: new group: name=clevis, GID=991
Oct 02 07:55:04 compute-0 useradd[127250]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Oct 02 07:55:04 compute-0 usermod[127260]: add 'clevis' to group 'tss'
Oct 02 07:55:04 compute-0 usermod[127260]: add 'clevis' to shadow group 'tss'
Oct 02 07:55:06 compute-0 polkitd[6272]: Reloading rules
Oct 02 07:55:06 compute-0 polkitd[6272]: Collecting garbage unconditionally...
Oct 02 07:55:06 compute-0 polkitd[6272]: Loading rules from directory /etc/polkit-1/rules.d
Oct 02 07:55:06 compute-0 polkitd[6272]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 02 07:55:06 compute-0 polkitd[6272]: Finished loading, compiling and executing 4 rules
Oct 02 07:55:06 compute-0 polkitd[6272]: Reloading rules
Oct 02 07:55:06 compute-0 polkitd[6272]: Collecting garbage unconditionally...
Oct 02 07:55:06 compute-0 polkitd[6272]: Loading rules from directory /etc/polkit-1/rules.d
Oct 02 07:55:06 compute-0 polkitd[6272]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 02 07:55:06 compute-0 polkitd[6272]: Finished loading, compiling and executing 4 rules
Oct 02 07:55:08 compute-0 groupadd[127447]: group added to /etc/group: name=ceph, GID=167
Oct 02 07:55:08 compute-0 groupadd[127447]: group added to /etc/gshadow: name=ceph
Oct 02 07:55:08 compute-0 groupadd[127447]: new group: name=ceph, GID=167
Oct 02 07:55:08 compute-0 useradd[127453]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Oct 02 07:55:11 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Oct 02 07:55:11 compute-0 sshd[1011]: Received signal 15; terminating.
Oct 02 07:55:11 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Oct 02 07:55:11 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Oct 02 07:55:11 compute-0 systemd[1]: sshd.service: Consumed 3.536s CPU time, read 0B from disk, written 8.0K to disk.
Oct 02 07:55:11 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Oct 02 07:55:11 compute-0 systemd[1]: Stopping sshd-keygen.target...
Oct 02 07:55:11 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 02 07:55:11 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 02 07:55:11 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 02 07:55:11 compute-0 systemd[1]: Reached target sshd-keygen.target.
Oct 02 07:55:11 compute-0 systemd[1]: Starting OpenSSH server daemon...
Oct 02 07:55:11 compute-0 sshd[127972]: Server listening on 0.0.0.0 port 22.
Oct 02 07:55:11 compute-0 sshd[127972]: Server listening on :: port 22.
Oct 02 07:55:11 compute-0 systemd[1]: Started OpenSSH server daemon.
Oct 02 07:55:14 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 02 07:55:14 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 02 07:55:14 compute-0 systemd[1]: Reloading.
Oct 02 07:55:14 compute-0 systemd-sysv-generator[128233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:55:14 compute-0 systemd-rc-local-generator[128226]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:55:14 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 02 07:55:17 compute-0 podman[130665]: 2025-10-02 07:55:17.193039563 +0000 UTC m=+0.105715159 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct 02 07:55:17 compute-0 podman[130682]: 2025-10-02 07:55:17.226023137 +0000 UTC m=+0.125093402 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 07:55:17 compute-0 systemd[1]: Starting PackageKit Daemon...
Oct 02 07:55:17 compute-0 PackageKit[131059]: daemon start
Oct 02 07:55:17 compute-0 systemd[1]: Started PackageKit Daemon.
Oct 02 07:55:17 compute-0 sudo[110118]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:18 compute-0 sudo[132485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmsjlkopemvvzzsquasziubganhmfgub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391718.1605227-652-280978038013207/AnsiballZ_systemd.py'
Oct 02 07:55:18 compute-0 sudo[132485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:19 compute-0 python3.9[132511]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 02 07:55:19 compute-0 systemd[1]: Reloading.
Oct 02 07:55:19 compute-0 systemd-rc-local-generator[132874]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:55:19 compute-0 systemd-sysv-generator[132879]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:55:19 compute-0 sudo[132485]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:19 compute-0 sudo[133586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yntasmfihicybdfxdqwdbudqfvgbyobd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391719.6575515-652-37771053027993/AnsiballZ_systemd.py'
Oct 02 07:55:19 compute-0 sudo[133586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:20 compute-0 python3.9[133608]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 02 07:55:20 compute-0 systemd[1]: Reloading.
Oct 02 07:55:20 compute-0 systemd-rc-local-generator[134104]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:55:20 compute-0 systemd-sysv-generator[134107]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:55:20 compute-0 sudo[133586]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:21 compute-0 sudo[134745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnektcwbhbczrsxurfuutvsqrlioqvqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391720.8173423-652-76543570598177/AnsiballZ_systemd.py'
Oct 02 07:55:21 compute-0 sudo[134745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:21 compute-0 python3.9[134767]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 02 07:55:21 compute-0 systemd[1]: Reloading.
Oct 02 07:55:21 compute-0 systemd-rc-local-generator[135179]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:55:21 compute-0 systemd-sysv-generator[135186]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:55:21 compute-0 sudo[134745]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:22 compute-0 sudo[135934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyritdlkwndtvbupjihouifferasiadu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391722.0197375-652-131341662308911/AnsiballZ_systemd.py'
Oct 02 07:55:22 compute-0 sudo[135934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:22 compute-0 python3.9[135951]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 02 07:55:22 compute-0 systemd[1]: Reloading.
Oct 02 07:55:22 compute-0 systemd-rc-local-generator[136348]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:55:22 compute-0 systemd-sysv-generator[136354]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:55:23 compute-0 sudo[135934]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:23 compute-0 sudo[137060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxjhcaqumlrhhkirtvjgjggwpfulvdss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391723.175754-710-152966917258779/AnsiballZ_systemd.py'
Oct 02 07:55:23 compute-0 sudo[137060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:23 compute-0 python3.9[137076]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:23 compute-0 systemd[1]: Reloading.
Oct 02 07:55:23 compute-0 systemd-sysv-generator[137456]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:55:23 compute-0 systemd-rc-local-generator[137453]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:55:24 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 02 07:55:24 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 02 07:55:24 compute-0 systemd[1]: man-db-cache-update.service: Consumed 12.582s CPU time.
Oct 02 07:55:24 compute-0 systemd[1]: run-r2f833914f26741ee96c803294a10dc38.service: Deactivated successfully.
Oct 02 07:55:24 compute-0 sudo[137060]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:24 compute-0 sudo[137616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdvzckmbdabfkusbilqlkotnfgoxjlxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391724.364163-710-82837892062234/AnsiballZ_systemd.py'
Oct 02 07:55:24 compute-0 sudo[137616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:24 compute-0 python3.9[137618]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:25 compute-0 systemd[1]: Reloading.
Oct 02 07:55:25 compute-0 systemd-sysv-generator[137652]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:55:25 compute-0 systemd-rc-local-generator[137648]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:55:25 compute-0 sudo[137616]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:25 compute-0 sudo[137806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfsbrspfowxqivnxntgpswwhluaaskqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391725.4750764-710-82363588243926/AnsiballZ_systemd.py'
Oct 02 07:55:25 compute-0 sudo[137806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:26 compute-0 python3.9[137808]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:26 compute-0 systemd[1]: Reloading.
Oct 02 07:55:26 compute-0 systemd-sysv-generator[137845]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:55:26 compute-0 systemd-rc-local-generator[137841]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:55:26 compute-0 sudo[137806]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:27 compute-0 sudo[137997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzxhkfnoyycirglmpxihabdwpmafdfpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391726.7106748-710-255084278331180/AnsiballZ_systemd.py'
Oct 02 07:55:27 compute-0 sudo[137997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:27 compute-0 python3.9[137999]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:27 compute-0 sudo[137997]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:28 compute-0 sudo[138152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-novfmphbbscukqnromlhxiwbajkzebik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391727.6989934-710-195716627805191/AnsiballZ_systemd.py'
Oct 02 07:55:28 compute-0 sudo[138152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:28 compute-0 python3.9[138154]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:28 compute-0 systemd[1]: Reloading.
Oct 02 07:55:28 compute-0 systemd-sysv-generator[138186]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:55:28 compute-0 systemd-rc-local-generator[138182]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:55:28 compute-0 sudo[138152]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:29 compute-0 sudo[138342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqnwwabqcoxhxelttraoxybwdkfglwsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391729.0604439-782-196625356333261/AnsiballZ_systemd.py'
Oct 02 07:55:29 compute-0 sudo[138342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:29 compute-0 python3.9[138344]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 02 07:55:29 compute-0 systemd[1]: Reloading.
Oct 02 07:55:29 compute-0 systemd-sysv-generator[138378]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:55:29 compute-0 systemd-rc-local-generator[138375]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:55:30 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Oct 02 07:55:30 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct 02 07:55:30 compute-0 sudo[138342]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:30 compute-0 sudo[138535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxptvtrowkwzupeiglmljrqbjwjhnezy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391730.3921895-798-59739069365513/AnsiballZ_systemd.py'
Oct 02 07:55:30 compute-0 sudo[138535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:30 compute-0 python3.9[138537]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:31 compute-0 sudo[138535]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:31 compute-0 sudo[138690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgzcpjbviowqmggdeufwokgbvayhfpkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391731.2120674-798-61763178555695/AnsiballZ_systemd.py'
Oct 02 07:55:31 compute-0 sudo[138690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:31 compute-0 python3.9[138692]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:32 compute-0 sudo[138690]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:32 compute-0 sudo[138845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owktbustgtdgjaxucqfzhvlxaasggtzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391732.2307148-798-89985543941101/AnsiballZ_systemd.py'
Oct 02 07:55:32 compute-0 sudo[138845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:32 compute-0 python3.9[138847]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:33 compute-0 sudo[138845]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:33 compute-0 sudo[139000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjqelbzzysekclivglgqnmomzftjnqwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391733.1537418-798-214159579811256/AnsiballZ_systemd.py'
Oct 02 07:55:33 compute-0 sudo[139000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:33 compute-0 python3.9[139002]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:33 compute-0 sudo[139000]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:34 compute-0 sudo[139155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivgbpbdqwihzztzlqgtsvqismhmcncrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391734.096152-798-98764058619977/AnsiballZ_systemd.py'
Oct 02 07:55:34 compute-0 sudo[139155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:34 compute-0 python3.9[139157]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:34 compute-0 sudo[139155]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:35 compute-0 sudo[139310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flewneonamzzioyfjtigswexlwrromww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391734.9631493-798-229803594282688/AnsiballZ_systemd.py'
Oct 02 07:55:35 compute-0 sudo[139310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:35 compute-0 python3.9[139312]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:35 compute-0 sudo[139310]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:36 compute-0 sudo[139465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loxbwjlyxillsopgcbpjoloxsmvjmmmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391735.8335257-798-33987237192762/AnsiballZ_systemd.py'
Oct 02 07:55:36 compute-0 sudo[139465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:36 compute-0 python3.9[139467]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:36 compute-0 sudo[139465]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:37 compute-0 sudo[139620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiaeuenihnpgshidzdctotbltysxzpjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391736.7982113-798-203997147498663/AnsiballZ_systemd.py'
Oct 02 07:55:37 compute-0 sudo[139620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:37 compute-0 python3.9[139622]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:37 compute-0 sudo[139620]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:38 compute-0 sudo[139775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxoryldhqrvqsqxhyyclprfwjkyrolts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391737.7333357-798-33267155369032/AnsiballZ_systemd.py'
Oct 02 07:55:38 compute-0 sudo[139775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:38 compute-0 python3.9[139777]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:38 compute-0 sudo[139775]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:39 compute-0 sudo[139930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axbgrfqinxubpekvamhthapkvfgsfybk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391738.724007-798-280745612770039/AnsiballZ_systemd.py'
Oct 02 07:55:39 compute-0 sudo[139930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:39 compute-0 python3.9[139932]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:39 compute-0 sudo[139930]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:40 compute-0 sudo[140085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdplkofhhaoaidnkxvcnrtkksiludsig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391739.672667-798-73522477874161/AnsiballZ_systemd.py'
Oct 02 07:55:40 compute-0 sudo[140085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:40 compute-0 python3.9[140087]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:40 compute-0 sudo[140085]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:40 compute-0 sudo[140240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-semckfiwkgsjynqenpghyeoyefyzgisz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391740.5428324-798-255415051235910/AnsiballZ_systemd.py'
Oct 02 07:55:40 compute-0 sudo[140240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:41 compute-0 python3.9[140242]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:41 compute-0 sudo[140240]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:41 compute-0 sudo[140395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooxghigmbgdcfddsoknuumopmcreejmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391741.5366423-798-51733510555600/AnsiballZ_systemd.py'
Oct 02 07:55:41 compute-0 sudo[140395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:42 compute-0 python3.9[140397]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:42 compute-0 sudo[140395]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:42 compute-0 sudo[140550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfebilvxquyrzkkfxcnnyfcbbqzpumip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391742.4392653-798-189432165661150/AnsiballZ_systemd.py'
Oct 02 07:55:42 compute-0 sudo[140550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:43 compute-0 python3.9[140552]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 02 07:55:43 compute-0 sudo[140550]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:55:45.953 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 07:55:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:55:45.955 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 07:55:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:55:45.955 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 07:55:46 compute-0 sudo[140705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uabjnhssawnkzfokvgrwjkqomnbxujnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391746.61691-1002-96212389037194/AnsiballZ_file.py'
Oct 02 07:55:46 compute-0 sudo[140705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:47 compute-0 python3.9[140707]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:55:47 compute-0 sudo[140705]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:47 compute-0 sudo[140880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghnlweaxtyluljxrsqgwzbkmthvozxyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391747.2888825-1002-170953905672644/AnsiballZ_file.py'
Oct 02 07:55:47 compute-0 sudo[140880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:47 compute-0 podman[140831]: 2025-10-02 07:55:47.660227931 +0000 UTC m=+0.061867953 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 02 07:55:47 compute-0 podman[140832]: 2025-10-02 07:55:47.742461629 +0000 UTC m=+0.144241045 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct 02 07:55:47 compute-0 python3.9[140894]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:55:47 compute-0 sudo[140880]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:48 compute-0 sudo[141054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxsynwneqhbjimabovjmmovvseaahhqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391747.986883-1002-32837431144864/AnsiballZ_file.py'
Oct 02 07:55:48 compute-0 sudo[141054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:48 compute-0 python3.9[141056]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:55:48 compute-0 sudo[141054]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:49 compute-0 sudo[141206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kixqwqlzplsjsjfxpdfghggwxyqygjtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391748.7049606-1002-175429969846782/AnsiballZ_file.py'
Oct 02 07:55:49 compute-0 sudo[141206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:49 compute-0 python3.9[141208]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:55:49 compute-0 sudo[141206]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:49 compute-0 sudo[141358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sczfzqnlqlqwtjuidzcxpljuppczadbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391749.4148474-1002-76474656341066/AnsiballZ_file.py'
Oct 02 07:55:49 compute-0 sudo[141358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:49 compute-0 python3.9[141360]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:55:50 compute-0 sudo[141358]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:50 compute-0 sudo[141510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-binjshtlpbioasixwktlryhsxnugeyci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391750.197608-1002-19049339987199/AnsiballZ_file.py'
Oct 02 07:55:50 compute-0 sudo[141510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:50 compute-0 python3.9[141512]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:55:50 compute-0 sudo[141510]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:51 compute-0 sudo[141662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlhddpcqbwirwwkuavagrpwxpmjncqqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391751.2412016-1088-53040759374157/AnsiballZ_stat.py'
Oct 02 07:55:51 compute-0 sudo[141662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:51 compute-0 python3.9[141664]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:55:52 compute-0 sudo[141662]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:52 compute-0 sudo[141787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phkiorsjlscoynmfpryexocaeopakurj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391751.2412016-1088-53040759374157/AnsiballZ_copy.py'
Oct 02 07:55:52 compute-0 sudo[141787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:52 compute-0 python3.9[141789]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759391751.2412016-1088-53040759374157/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:55:52 compute-0 sudo[141787]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:53 compute-0 sudo[141939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxaceptmdpkcbzasqdeqclmlosvkscsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391753.0281203-1088-194330336187570/AnsiballZ_stat.py'
Oct 02 07:55:53 compute-0 sudo[141939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:53 compute-0 python3.9[141941]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:55:53 compute-0 sudo[141939]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:53 compute-0 sudo[142064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aweyzefddokonethcgsocgsilunabmyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391753.0281203-1088-194330336187570/AnsiballZ_copy.py'
Oct 02 07:55:53 compute-0 sudo[142064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:54 compute-0 python3.9[142066]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759391753.0281203-1088-194330336187570/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:55:54 compute-0 sudo[142064]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:54 compute-0 sudo[142216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asqcyoplwcqaotdiunabmozzmohfuyfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391754.3076468-1088-84367146665512/AnsiballZ_stat.py'
Oct 02 07:55:54 compute-0 sudo[142216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:54 compute-0 python3.9[142218]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:55:54 compute-0 sudo[142216]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:55 compute-0 sudo[142341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bisgrdmlliqppebrmmhnwjdgrghtephy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391754.3076468-1088-84367146665512/AnsiballZ_copy.py'
Oct 02 07:55:55 compute-0 sudo[142341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:55 compute-0 python3.9[142343]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759391754.3076468-1088-84367146665512/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:55:55 compute-0 sudo[142341]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:56 compute-0 sudo[142493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qweytjmgvjluavosjqzpgzesyduvdxwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391755.714186-1088-110830933677849/AnsiballZ_stat.py'
Oct 02 07:55:56 compute-0 sudo[142493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:56 compute-0 python3.9[142495]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:55:56 compute-0 sudo[142493]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:56 compute-0 sudo[142618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfzhnwocbejiohducxgvlbjovoadlncu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391755.714186-1088-110830933677849/AnsiballZ_copy.py'
Oct 02 07:55:56 compute-0 sudo[142618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:56 compute-0 python3.9[142620]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759391755.714186-1088-110830933677849/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:55:57 compute-0 sudo[142618]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:57 compute-0 sudo[142770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plkbumwwitthdenzxdfkcrlbykqolltz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391757.2897456-1088-26229221934886/AnsiballZ_stat.py'
Oct 02 07:55:57 compute-0 sudo[142770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:57 compute-0 python3.9[142772]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:55:57 compute-0 sudo[142770]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:58 compute-0 sudo[142895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trwnqoazbvqfabwoiustzhbsrjyacuel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391757.2897456-1088-26229221934886/AnsiballZ_copy.py'
Oct 02 07:55:58 compute-0 sudo[142895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:58 compute-0 python3.9[142897]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759391757.2897456-1088-26229221934886/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:55:58 compute-0 sudo[142895]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:59 compute-0 sudo[143047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyagtfopjkojpznywsuginicbcbcvmfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391758.775226-1088-101282874769192/AnsiballZ_stat.py'
Oct 02 07:55:59 compute-0 sudo[143047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:55:59 compute-0 python3.9[143049]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:55:59 compute-0 sudo[143047]: pam_unix(sudo:session): session closed for user root
Oct 02 07:55:59 compute-0 sudo[143172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnjyayawkspwygvheaebrtuxekgnahah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391758.775226-1088-101282874769192/AnsiballZ_copy.py'
Oct 02 07:55:59 compute-0 sudo[143172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:00 compute-0 python3.9[143174]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759391758.775226-1088-101282874769192/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:00 compute-0 sudo[143172]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:00 compute-0 sudo[143324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wghblijdrfkyubmqxzjusqmcqpljlyrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391760.2625122-1088-88448179011535/AnsiballZ_stat.py'
Oct 02 07:56:00 compute-0 sudo[143324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:00 compute-0 python3.9[143326]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:00 compute-0 sudo[143324]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:01 compute-0 sudo[143447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jehbmpcyskfdqohsljrltqdxtkzppufw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391760.2625122-1088-88448179011535/AnsiballZ_copy.py'
Oct 02 07:56:01 compute-0 sudo[143447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:01 compute-0 python3.9[143449]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759391760.2625122-1088-88448179011535/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:01 compute-0 sudo[143447]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:02 compute-0 sudo[143599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gquaeqyosinntxctyfhhqorroxskbreb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391761.556105-1088-159642610761490/AnsiballZ_stat.py'
Oct 02 07:56:02 compute-0 sudo[143599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:02 compute-0 python3.9[143601]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:02 compute-0 sudo[143599]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:02 compute-0 sudo[143724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aozhfguxealvjnvcleiapzqrqravjpto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391761.556105-1088-159642610761490/AnsiballZ_copy.py'
Oct 02 07:56:02 compute-0 sudo[143724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:02 compute-0 python3.9[143726]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759391761.556105-1088-159642610761490/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:02 compute-0 sudo[143724]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:03 compute-0 sudo[143876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muvplljgbmpnsujgjmfmiqnpyipkrkcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391763.0893526-1314-83926262405935/AnsiballZ_command.py'
Oct 02 07:56:03 compute-0 sudo[143876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:03 compute-0 python3.9[143878]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct 02 07:56:03 compute-0 sudo[143876]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:04 compute-0 sudo[144029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbxacfdxnvkqtkcfkdgfrapifkjfwdws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391763.8969905-1332-195295759324822/AnsiballZ_file.py'
Oct 02 07:56:04 compute-0 sudo[144029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:04 compute-0 python3.9[144031]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:04 compute-0 sudo[144029]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:04 compute-0 sudo[144181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xclqkbzjnczgmkarwrisdeuhjaojiltb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391764.6546001-1332-11886714717800/AnsiballZ_file.py'
Oct 02 07:56:04 compute-0 sudo[144181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:05 compute-0 python3.9[144183]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:05 compute-0 sudo[144181]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:05 compute-0 sudo[144333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwsahrdwsopyysxdtkkxmohexdsgjvuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391765.3133826-1332-141742926817631/AnsiballZ_file.py'
Oct 02 07:56:05 compute-0 sudo[144333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:05 compute-0 python3.9[144335]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:05 compute-0 sudo[144333]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:06 compute-0 sudo[144485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfvcbfdqtogtzvndyjqyuxqrpsvxnlad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391766.073124-1332-57752498675988/AnsiballZ_file.py'
Oct 02 07:56:06 compute-0 sudo[144485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:06 compute-0 python3.9[144487]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:06 compute-0 sudo[144485]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:07 compute-0 sudo[144637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwahiyptzopgjleiwbmanzjubtrglqgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391766.826129-1332-58093913688686/AnsiballZ_file.py'
Oct 02 07:56:07 compute-0 sudo[144637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:07 compute-0 python3.9[144639]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:07 compute-0 sudo[144637]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:08 compute-0 sudo[144789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqssvfmdhytnabubrvhxxdssjwykzpsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391767.651082-1332-168455953121588/AnsiballZ_file.py'
Oct 02 07:56:08 compute-0 sudo[144789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:08 compute-0 python3.9[144791]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:08 compute-0 sudo[144789]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:08 compute-0 sudo[144941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jadvgabaiqobbwskotthfamcqkqoeluw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391768.4835122-1332-273005336728790/AnsiballZ_file.py'
Oct 02 07:56:08 compute-0 sudo[144941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:09 compute-0 python3.9[144943]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:09 compute-0 sudo[144941]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:09 compute-0 sudo[145093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvmcxfeojnothbypcyeraiahopvmyzxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391769.237072-1332-168033222840900/AnsiballZ_file.py'
Oct 02 07:56:09 compute-0 sudo[145093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:09 compute-0 python3.9[145095]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:09 compute-0 sudo[145093]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:10 compute-0 sudo[145245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iodnhczmxnxsipedcuehwkcwmiosdodr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391770.0065544-1332-129372325507058/AnsiballZ_file.py'
Oct 02 07:56:10 compute-0 sudo[145245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:10 compute-0 python3.9[145247]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:10 compute-0 sudo[145245]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:11 compute-0 sudo[145397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlevpkxfyhcjelqhxnbzzqyadwdqtdtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391770.7347908-1332-129511961148339/AnsiballZ_file.py'
Oct 02 07:56:11 compute-0 sudo[145397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:11 compute-0 python3.9[145399]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:11 compute-0 sudo[145397]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:11 compute-0 sudo[145549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjxuhaltxeekzpiwkdcaavvijjzzuqzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391771.3637965-1332-274017013448531/AnsiballZ_file.py'
Oct 02 07:56:11 compute-0 sudo[145549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:11 compute-0 python3.9[145551]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:11 compute-0 sudo[145549]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:12 compute-0 sudo[145701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgutcdlvqafqjpzefxecckzelszkbfpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391772.08502-1332-145969678767584/AnsiballZ_file.py'
Oct 02 07:56:12 compute-0 sudo[145701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:12 compute-0 python3.9[145703]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:12 compute-0 sudo[145701]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:13 compute-0 sudo[145853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uogunobynhxeupgxhlqyiqwroexvvrqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391772.743478-1332-142594016626198/AnsiballZ_file.py'
Oct 02 07:56:13 compute-0 sudo[145853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:13 compute-0 python3.9[145855]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:13 compute-0 sudo[145853]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:13 compute-0 sudo[146005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvknbcgexsxwjzhhebprwbwjpowkkilt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391773.4766266-1332-128696172740790/AnsiballZ_file.py'
Oct 02 07:56:13 compute-0 sudo[146005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:14 compute-0 python3.9[146007]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:14 compute-0 sudo[146005]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:14 compute-0 sudo[146157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rymhthktxqeentunmiuwmsbzlvdkjeaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391774.2978444-1530-111168528065105/AnsiballZ_stat.py'
Oct 02 07:56:14 compute-0 sudo[146157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:14 compute-0 python3.9[146159]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:14 compute-0 sudo[146157]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:15 compute-0 sudo[146280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqouijhqtxcfcounvrkwiwygislwaxlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391774.2978444-1530-111168528065105/AnsiballZ_copy.py'
Oct 02 07:56:15 compute-0 sudo[146280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:15 compute-0 python3.9[146282]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391774.2978444-1530-111168528065105/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:15 compute-0 sudo[146280]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:15 compute-0 sudo[146432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noinmwrjemloshmszrhqqyldbnkjdnbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391775.68268-1530-79615955087255/AnsiballZ_stat.py'
Oct 02 07:56:15 compute-0 sudo[146432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:16 compute-0 python3.9[146434]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:16 compute-0 sudo[146432]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:16 compute-0 sudo[146555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixsgxcjevxlwacfhuxzsmhyabxluyntr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391775.68268-1530-79615955087255/AnsiballZ_copy.py'
Oct 02 07:56:16 compute-0 sudo[146555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:16 compute-0 python3.9[146557]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391775.68268-1530-79615955087255/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:16 compute-0 sudo[146555]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:17 compute-0 sudo[146707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xublsybgdatebhbpohtnvgswsfwutxfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391776.8954322-1530-125283234567910/AnsiballZ_stat.py'
Oct 02 07:56:17 compute-0 sudo[146707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:17 compute-0 python3.9[146709]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:17 compute-0 sudo[146707]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:17 compute-0 sudo[146842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cohklamlzzrogknfefvbyvqjptbvtxkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391776.8954322-1530-125283234567910/AnsiballZ_copy.py'
Oct 02 07:56:17 compute-0 sudo[146842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:17 compute-0 podman[146804]: 2025-10-02 07:56:17.833266116 +0000 UTC m=+0.084565221 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 02 07:56:17 compute-0 podman[146848]: 2025-10-02 07:56:17.940125183 +0000 UTC m=+0.102505361 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 07:56:18 compute-0 python3.9[146851]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391776.8954322-1530-125283234567910/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:18 compute-0 sudo[146842]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:18 compute-0 sudo[147026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngiyqjpobdtijkgopeynyzxpdzjumdhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391778.189211-1530-131840559509669/AnsiballZ_stat.py'
Oct 02 07:56:18 compute-0 sudo[147026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:18 compute-0 python3.9[147028]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:18 compute-0 sudo[147026]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:19 compute-0 sudo[147149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nltnqbsicjrpvjgetogqqgonyqgaunnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391778.189211-1530-131840559509669/AnsiballZ_copy.py'
Oct 02 07:56:19 compute-0 sudo[147149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:19 compute-0 python3.9[147151]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391778.189211-1530-131840559509669/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:19 compute-0 sudo[147149]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:20 compute-0 sudo[147301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mirgeoitfbtxvtbysadwwwflrdjqbgth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391779.725724-1530-141897710862175/AnsiballZ_stat.py'
Oct 02 07:56:20 compute-0 sudo[147301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:20 compute-0 python3.9[147303]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:20 compute-0 sudo[147301]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:20 compute-0 sudo[147424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gliyuedhxtpdqbwmjaffbxmcsehncbci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391779.725724-1530-141897710862175/AnsiballZ_copy.py'
Oct 02 07:56:20 compute-0 sudo[147424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:20 compute-0 python3.9[147426]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391779.725724-1530-141897710862175/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:20 compute-0 sudo[147424]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:21 compute-0 sudo[147576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kizdxkxfrhpksaihiwuzqbqtnebtbxjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391781.0591214-1530-269596575986142/AnsiballZ_stat.py'
Oct 02 07:56:21 compute-0 sudo[147576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:21 compute-0 python3.9[147578]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:21 compute-0 sudo[147576]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:22 compute-0 sudo[147699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqmnjdgccifmalahetquveewggcfscho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391781.0591214-1530-269596575986142/AnsiballZ_copy.py'
Oct 02 07:56:22 compute-0 sudo[147699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:22 compute-0 python3.9[147701]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391781.0591214-1530-269596575986142/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:22 compute-0 sudo[147699]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:22 compute-0 sudo[147851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfprmuloxqhhjnuyjjtlneieilpetbop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391782.4665265-1530-14376530806002/AnsiballZ_stat.py'
Oct 02 07:56:22 compute-0 sudo[147851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:23 compute-0 python3.9[147853]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:23 compute-0 sudo[147851]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:23 compute-0 sudo[147974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hghjokdpkdurtosqtfhcqvgbszlsybjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391782.4665265-1530-14376530806002/AnsiballZ_copy.py'
Oct 02 07:56:23 compute-0 sudo[147974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:23 compute-0 python3.9[147976]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391782.4665265-1530-14376530806002/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:23 compute-0 sudo[147974]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:24 compute-0 sudo[148126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnndsyfaetlzcmwktqjsfqbgkkaxyipk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391783.9613452-1530-153351648878972/AnsiballZ_stat.py'
Oct 02 07:56:24 compute-0 sudo[148126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:24 compute-0 python3.9[148128]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:24 compute-0 sudo[148126]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:24 compute-0 sudo[148249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luixvihckkdjelvejlxjrrtvvrczozhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391783.9613452-1530-153351648878972/AnsiballZ_copy.py'
Oct 02 07:56:24 compute-0 sudo[148249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:25 compute-0 python3.9[148251]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391783.9613452-1530-153351648878972/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:25 compute-0 sudo[148249]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:25 compute-0 sudo[148401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obkurblwznpiydneznuyemcxgfeabetu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391785.2823646-1530-234115248828469/AnsiballZ_stat.py'
Oct 02 07:56:25 compute-0 sudo[148401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:25 compute-0 python3.9[148403]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:25 compute-0 sudo[148401]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:26 compute-0 sudo[148524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xllbrzgkolbdombcuszwijaoaimxcnhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391785.2823646-1530-234115248828469/AnsiballZ_copy.py'
Oct 02 07:56:26 compute-0 sudo[148524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:26 compute-0 python3.9[148526]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391785.2823646-1530-234115248828469/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:26 compute-0 sudo[148524]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:27 compute-0 sudo[148676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlzvqygbbmoliwrvgsfaoerjyotpxwum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391786.797391-1530-131439939739942/AnsiballZ_stat.py'
Oct 02 07:56:27 compute-0 sudo[148676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:27 compute-0 python3.9[148678]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:27 compute-0 sudo[148676]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:27 compute-0 sudo[148799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbbtrtislxpzlqrfbfapayxvkojciyhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391786.797391-1530-131439939739942/AnsiballZ_copy.py'
Oct 02 07:56:27 compute-0 sudo[148799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:28 compute-0 python3.9[148801]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391786.797391-1530-131439939739942/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:28 compute-0 sudo[148799]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:28 compute-0 sudo[148951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrfzzmzwswhwnczrchrvtsjamhekjkrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391788.3376667-1530-65094304372253/AnsiballZ_stat.py'
Oct 02 07:56:28 compute-0 sudo[148951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:28 compute-0 python3.9[148953]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:29 compute-0 sudo[148951]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:29 compute-0 sudo[149074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bncprsswkbncympodqctvzihmibwbrbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391788.3376667-1530-65094304372253/AnsiballZ_copy.py'
Oct 02 07:56:29 compute-0 sudo[149074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:29 compute-0 python3.9[149076]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391788.3376667-1530-65094304372253/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:29 compute-0 sudo[149074]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:30 compute-0 sudo[149226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxhkkagnwieytmhrytobpgamsrhbznla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391789.8958342-1530-27062232732196/AnsiballZ_stat.py'
Oct 02 07:56:30 compute-0 sudo[149226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:30 compute-0 python3.9[149228]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:30 compute-0 sudo[149226]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:30 compute-0 sudo[149349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzgsacxtrbfqygpoxfhcmykeccsmzkrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391789.8958342-1530-27062232732196/AnsiballZ_copy.py'
Oct 02 07:56:30 compute-0 sudo[149349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:31 compute-0 python3.9[149351]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391789.8958342-1530-27062232732196/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:31 compute-0 sudo[149349]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:31 compute-0 sudo[149501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqzyinpwfqppfgifujgeqxxmhatmlxsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391791.3025792-1530-198800271815437/AnsiballZ_stat.py'
Oct 02 07:56:31 compute-0 sudo[149501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:31 compute-0 python3.9[149503]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:31 compute-0 sudo[149501]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:32 compute-0 sudo[149624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuvjcmpysrmuhprabrsxugyhmkqgbbbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391791.3025792-1530-198800271815437/AnsiballZ_copy.py'
Oct 02 07:56:32 compute-0 sudo[149624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:32 compute-0 python3.9[149626]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391791.3025792-1530-198800271815437/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:32 compute-0 sudo[149624]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:33 compute-0 sudo[149776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqdmpvqccbombomrykgacwcgslantrcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391792.764549-1530-233454367624053/AnsiballZ_stat.py'
Oct 02 07:56:33 compute-0 sudo[149776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:33 compute-0 python3.9[149778]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:33 compute-0 sudo[149776]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:33 compute-0 sudo[149899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztceyyhxlqusblelzplpxshpoohlltze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391792.764549-1530-233454367624053/AnsiballZ_copy.py'
Oct 02 07:56:33 compute-0 sudo[149899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:33 compute-0 python3.9[149901]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391792.764549-1530-233454367624053/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:34 compute-0 sudo[149899]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:35 compute-0 python3.9[150051]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:56:35 compute-0 sudo[150204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abtxsdcsihrwyensqjpkghaluabfquag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391795.3827455-1942-29106501263919/AnsiballZ_seboolean.py'
Oct 02 07:56:35 compute-0 sudo[150204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:36 compute-0 python3.9[150206]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct 02 07:56:37 compute-0 sudo[150204]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:38 compute-0 sudo[150360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twnqoyuigefpmjjknywjadcrjuxxohnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391797.6455863-1958-35642948923922/AnsiballZ_copy.py'
Oct 02 07:56:38 compute-0 dbus-broker-launch[811]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct 02 07:56:38 compute-0 sudo[150360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:38 compute-0 python3.9[150362]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:38 compute-0 sudo[150360]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:38 compute-0 sudo[150512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sopsqhbvthkvfwcxjitgbgyhbdoivvcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391798.4506361-1958-155075498203755/AnsiballZ_copy.py'
Oct 02 07:56:38 compute-0 sudo[150512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:39 compute-0 python3.9[150514]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:39 compute-0 sudo[150512]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:39 compute-0 sudo[150664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuovafyvpkzugldczqioagpouufqttqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391799.2117662-1958-21153600354344/AnsiballZ_copy.py'
Oct 02 07:56:39 compute-0 sudo[150664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:39 compute-0 python3.9[150666]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:39 compute-0 sudo[150664]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:40 compute-0 sudo[150816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdrvlewpzbwkllrzbofktedfrlibrreq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391799.9440596-1958-51941415727429/AnsiballZ_copy.py'
Oct 02 07:56:40 compute-0 sudo[150816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:40 compute-0 python3.9[150818]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:40 compute-0 sudo[150816]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:41 compute-0 sudo[150968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsmawcpsrbkjvsdgawbqxzmwveuuobqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391800.9111397-1958-37732990058351/AnsiballZ_copy.py'
Oct 02 07:56:41 compute-0 sudo[150968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:41 compute-0 python3.9[150970]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:41 compute-0 sudo[150968]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:42 compute-0 sudo[151120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpzrvqezvpfofmzukistbbjukuwbscjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391801.7246273-2030-47487957087487/AnsiballZ_copy.py'
Oct 02 07:56:42 compute-0 sudo[151120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:42 compute-0 python3.9[151122]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:42 compute-0 sudo[151120]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:42 compute-0 sudo[151272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjktydxsdkmiflyftrfipoznqvdhttvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391802.3835099-2030-133321645948165/AnsiballZ_copy.py'
Oct 02 07:56:42 compute-0 sudo[151272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:42 compute-0 python3.9[151274]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:42 compute-0 sudo[151272]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:43 compute-0 sudo[151424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgmpbcabikqnkiscntpxsktipvgqcgfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391803.056416-2030-178333167392376/AnsiballZ_copy.py'
Oct 02 07:56:43 compute-0 sudo[151424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:43 compute-0 python3.9[151426]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:43 compute-0 sudo[151424]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:44 compute-0 sudo[151576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrvtpbzfzvbqhdwtodeazeqegafayjum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391803.7643964-2030-106488006365758/AnsiballZ_copy.py'
Oct 02 07:56:44 compute-0 sudo[151576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:44 compute-0 python3.9[151578]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:44 compute-0 sudo[151576]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:44 compute-0 sudo[151728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqmxzpphveotjdlnoqyjarrsarrkjhja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391804.4576607-2030-179808392297042/AnsiballZ_copy.py'
Oct 02 07:56:44 compute-0 sudo[151728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:45 compute-0 python3.9[151730]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:45 compute-0 sudo[151728]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:45 compute-0 sudo[151880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuzuktlzhasthsqtcoypfmjthmkasbgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391805.3557956-2102-153015256300647/AnsiballZ_systemd.py'
Oct 02 07:56:45 compute-0 sudo[151880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:56:45.954 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 07:56:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:56:45.955 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 07:56:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:56:45.955 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 07:56:46 compute-0 python3.9[151882]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:56:46 compute-0 systemd[1]: Reloading.
Oct 02 07:56:46 compute-0 systemd-rc-local-generator[151909]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:56:46 compute-0 systemd-sysv-generator[151913]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:56:46 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Oct 02 07:56:46 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Oct 02 07:56:46 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Oct 02 07:56:46 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct 02 07:56:46 compute-0 systemd[1]: Starting libvirt logging daemon...
Oct 02 07:56:46 compute-0 systemd[1]: Started libvirt logging daemon.
Oct 02 07:56:46 compute-0 sudo[151880]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:47 compute-0 sudo[152073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvlgxlraoygtwmftqlnihyhowybekrmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391806.8559134-2102-233140312021675/AnsiballZ_systemd.py'
Oct 02 07:56:47 compute-0 sudo[152073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:47 compute-0 python3.9[152075]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:56:47 compute-0 systemd[1]: Reloading.
Oct 02 07:56:47 compute-0 systemd-rc-local-generator[152101]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:56:47 compute-0 systemd-sysv-generator[152106]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:56:47 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Oct 02 07:56:47 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct 02 07:56:47 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct 02 07:56:47 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct 02 07:56:47 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct 02 07:56:47 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct 02 07:56:47 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct 02 07:56:47 compute-0 podman[152111]: 2025-10-02 07:56:47.962380871 +0000 UTC m=+0.065957707 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 02 07:56:47 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct 02 07:56:48 compute-0 sudo[152073]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:48 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct 02 07:56:48 compute-0 podman[152135]: 2025-10-02 07:56:48.081283415 +0000 UTC m=+0.111915376 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 02 07:56:48 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct 02 07:56:48 compute-0 sudo[152338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svtubfyimccvaphczywaaegtpfbdhdmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391808.1778443-2102-173133968940129/AnsiballZ_systemd.py'
Oct 02 07:56:48 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct 02 07:56:48 compute-0 sudo[152338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:48 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct 02 07:56:48 compute-0 python3.9[152343]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:56:48 compute-0 systemd[1]: Reloading.
Oct 02 07:56:48 compute-0 systemd-rc-local-generator[152375]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:56:48 compute-0 systemd-sysv-generator[152378]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:56:49 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct 02 07:56:49 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct 02 07:56:49 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct 02 07:56:49 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct 02 07:56:49 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 07:56:49 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 07:56:49 compute-0 sudo[152338]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:49 compute-0 setroubleshoot[152177]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l f8014fd1-28fb-487d-ae38-3bf312dbff6f
Oct 02 07:56:49 compute-0 setroubleshoot[152177]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Oct 02 07:56:49 compute-0 sudo[152555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anbrrwpuqsggaekslqrrsfetncahtdmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391809.6120515-2102-20299460714998/AnsiballZ_systemd.py'
Oct 02 07:56:49 compute-0 sudo[152555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:50 compute-0 python3.9[152557]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:56:50 compute-0 systemd[1]: Reloading.
Oct 02 07:56:50 compute-0 systemd-rc-local-generator[152579]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:56:50 compute-0 systemd-sysv-generator[152582]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:56:50 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Oct 02 07:56:50 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Oct 02 07:56:50 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 02 07:56:50 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct 02 07:56:50 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct 02 07:56:50 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct 02 07:56:50 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct 02 07:56:50 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct 02 07:56:50 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct 02 07:56:50 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct 02 07:56:50 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct 02 07:56:50 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct 02 07:56:50 compute-0 sudo[152555]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:51 compute-0 sudo[152768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ribqyqtqxiohkgnonphujiebrrrjpdth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391810.9498074-2102-171460936917528/AnsiballZ_systemd.py'
Oct 02 07:56:51 compute-0 sudo[152768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:51 compute-0 python3.9[152770]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:56:51 compute-0 systemd[1]: Reloading.
Oct 02 07:56:51 compute-0 systemd-rc-local-generator[152795]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:56:51 compute-0 systemd-sysv-generator[152799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:56:51 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Oct 02 07:56:51 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Oct 02 07:56:51 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Oct 02 07:56:51 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct 02 07:56:51 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct 02 07:56:51 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct 02 07:56:51 compute-0 systemd[1]: Starting libvirt secret daemon...
Oct 02 07:56:51 compute-0 systemd[1]: Started libvirt secret daemon.
Oct 02 07:56:52 compute-0 sudo[152768]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:52 compute-0 sudo[152979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmumnpiaopvntdummwbkbqzguhkdoden ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391812.2660115-2176-135896310520553/AnsiballZ_file.py'
Oct 02 07:56:52 compute-0 sudo[152979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:52 compute-0 python3.9[152981]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:52 compute-0 sudo[152979]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:53 compute-0 sudo[153131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sixzspxhjuomxpzxubnesjhbwxbyapom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391813.054411-2192-2143993304423/AnsiballZ_find.py'
Oct 02 07:56:53 compute-0 sudo[153131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:53 compute-0 python3.9[153133]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 02 07:56:53 compute-0 sudo[153131]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:54 compute-0 sudo[153283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unbxmcoujzppflyzxkgqlekvjbbabwjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391814.1874313-2220-168594128765801/AnsiballZ_stat.py'
Oct 02 07:56:54 compute-0 sudo[153283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:54 compute-0 python3.9[153285]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:54 compute-0 sudo[153283]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:55 compute-0 sudo[153406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngwgebehwpdvgqpcemaewiqoltudwgtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391814.1874313-2220-168594128765801/AnsiballZ_copy.py'
Oct 02 07:56:55 compute-0 sudo[153406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:55 compute-0 python3.9[153408]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391814.1874313-2220-168594128765801/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:55 compute-0 sudo[153406]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:56 compute-0 sudo[153558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggevhijejfuezjxzadbtzgyyoilyajfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391815.7433944-2252-138352015193246/AnsiballZ_file.py'
Oct 02 07:56:56 compute-0 sudo[153558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:56 compute-0 python3.9[153560]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:56 compute-0 sudo[153558]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:57 compute-0 sudo[153710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lufckbcctrpqtqjioktjdzapiqlqzwsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391816.5901859-2268-64101821262318/AnsiballZ_stat.py'
Oct 02 07:56:57 compute-0 sudo[153710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:57 compute-0 python3.9[153712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:57 compute-0 sudo[153710]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:57 compute-0 sudo[153788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cigkvwwwaiwpmpklpyjnxvjvlffjncib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391816.5901859-2268-64101821262318/AnsiballZ_file.py'
Oct 02 07:56:57 compute-0 sudo[153788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:57 compute-0 python3.9[153790]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:57 compute-0 sudo[153788]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:58 compute-0 sudo[153940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tngitascqjssdbtnmarixocnenatbiim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391818.0542545-2292-54286071609482/AnsiballZ_stat.py'
Oct 02 07:56:58 compute-0 sudo[153940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:58 compute-0 python3.9[153942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:56:58 compute-0 sudo[153940]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:58 compute-0 sudo[154018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxbetiprmxgyjdunropvqhgvqkrrqdwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391818.0542545-2292-54286071609482/AnsiballZ_file.py'
Oct 02 07:56:58 compute-0 sudo[154018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:59 compute-0 python3.9[154020]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hnsox7fb recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:56:59 compute-0 sudo[154018]: pam_unix(sudo:session): session closed for user root
Oct 02 07:56:59 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct 02 07:56:59 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.055s CPU time.
Oct 02 07:56:59 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct 02 07:56:59 compute-0 sudo[154170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlqufgkgdrgvgelbkeueobclxjvxfcjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391819.4109027-2316-95097727822199/AnsiballZ_stat.py'
Oct 02 07:56:59 compute-0 sudo[154170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:56:59 compute-0 python3.9[154172]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:00 compute-0 sudo[154170]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:00 compute-0 sudo[154248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyszhyggzwgiengnmfqjbtnrjppirahd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391819.4109027-2316-95097727822199/AnsiballZ_file.py'
Oct 02 07:57:00 compute-0 sudo[154248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:00 compute-0 python3.9[154250]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:00 compute-0 sudo[154248]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:01 compute-0 sudo[154400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhgxfcurdunpywpfvblgsnvtrlfmodgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391820.7077775-2342-185910129418648/AnsiballZ_command.py'
Oct 02 07:57:01 compute-0 sudo[154400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:01 compute-0 python3.9[154402]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:57:01 compute-0 sudo[154400]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:02 compute-0 sudo[154553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjazmqntukestulalyhekefgnanfoqxs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759391821.5068786-2358-23825017076561/AnsiballZ_edpm_nftables_from_files.py'
Oct 02 07:57:02 compute-0 sudo[154553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:02 compute-0 python3[154555]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 02 07:57:02 compute-0 sudo[154553]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:02 compute-0 sudo[154705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epgqpdfpzxkzqmmhhndqnrdgdhsreota ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391822.4452772-2374-218025243784832/AnsiballZ_stat.py'
Oct 02 07:57:02 compute-0 sudo[154705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:03 compute-0 python3.9[154707]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:03 compute-0 sudo[154705]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:03 compute-0 sudo[154783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrorkmngvexdwayazyzaxonuqvlczrli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391822.4452772-2374-218025243784832/AnsiballZ_file.py'
Oct 02 07:57:03 compute-0 sudo[154783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:03 compute-0 python3.9[154785]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:03 compute-0 sudo[154783]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:04 compute-0 sudo[154935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxqpmfxokwxwxrpbdomrdmwwjmtvskgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391823.8279762-2398-188194796531944/AnsiballZ_stat.py'
Oct 02 07:57:04 compute-0 sudo[154935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:04 compute-0 python3.9[154937]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:04 compute-0 sudo[154935]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:04 compute-0 sudo[155013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iakjaksyrtzicwuhhdcqivvxfhdpmiau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391823.8279762-2398-188194796531944/AnsiballZ_file.py'
Oct 02 07:57:04 compute-0 sudo[155013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:04 compute-0 python3.9[155015]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:04 compute-0 sudo[155013]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:05 compute-0 sudo[155165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhytagrwfixpnupnffkygrorvzwfvylh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391825.1887612-2422-181960070965555/AnsiballZ_stat.py'
Oct 02 07:57:05 compute-0 sudo[155165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:05 compute-0 python3.9[155167]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:05 compute-0 sudo[155165]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:06 compute-0 sudo[155243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pysldlhahwbjctdltgcrapbcxgjqmfne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391825.1887612-2422-181960070965555/AnsiballZ_file.py'
Oct 02 07:57:06 compute-0 sudo[155243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:06 compute-0 python3.9[155245]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:06 compute-0 sudo[155243]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:06 compute-0 sudo[155395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mummqiruvvtbggudgflqcnqcnbccgpup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391826.6270134-2446-232652229872717/AnsiballZ_stat.py'
Oct 02 07:57:06 compute-0 sudo[155395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:07 compute-0 python3.9[155397]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:07 compute-0 sudo[155395]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:07 compute-0 sudo[155473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvroxtgddzysitffoaahtwtdwuxuupjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391826.6270134-2446-232652229872717/AnsiballZ_file.py'
Oct 02 07:57:07 compute-0 sudo[155473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:07 compute-0 python3.9[155475]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:07 compute-0 sudo[155473]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:08 compute-0 sudo[155625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vftodxigpmknnatexjvhleaoxtppzfjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391827.8919759-2470-117012534642161/AnsiballZ_stat.py'
Oct 02 07:57:08 compute-0 sudo[155625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:08 compute-0 python3.9[155627]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:08 compute-0 sudo[155625]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:09 compute-0 sudo[155750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tszgafbkjnidmnpqlbcjvcvvcnperstr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391827.8919759-2470-117012534642161/AnsiballZ_copy.py'
Oct 02 07:57:09 compute-0 sudo[155750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:09 compute-0 python3.9[155752]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759391827.8919759-2470-117012534642161/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:09 compute-0 sudo[155750]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:09 compute-0 sudo[155902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivdadohvldhesevlexwutjjxcbtckunn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391829.4870732-2500-157401036851567/AnsiballZ_file.py'
Oct 02 07:57:09 compute-0 sudo[155902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:10 compute-0 python3.9[155904]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:10 compute-0 sudo[155902]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:10 compute-0 sudo[156054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdhnmzkxvxfkvmqukeiecifulsalpevz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391830.417065-2516-147534110060253/AnsiballZ_command.py'
Oct 02 07:57:10 compute-0 sudo[156054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:11 compute-0 python3.9[156056]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:57:11 compute-0 sudo[156054]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:11 compute-0 sudo[156209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcdzbyduzhfwirdixnydpxkkwakewvta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391831.297986-2532-3948155338432/AnsiballZ_blockinfile.py'
Oct 02 07:57:11 compute-0 sudo[156209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:12 compute-0 python3.9[156211]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:12 compute-0 sudo[156209]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:12 compute-0 sudo[156361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qktwidyrgnuoasqhibxlceninyzgmdwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391832.396287-2550-76107824972164/AnsiballZ_command.py'
Oct 02 07:57:12 compute-0 sudo[156361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:13 compute-0 python3.9[156363]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:57:13 compute-0 sudo[156361]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:13 compute-0 sudo[156514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ineodxxdaprwavkaytgmydulcuaattvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391833.2631679-2566-85558869069816/AnsiballZ_stat.py'
Oct 02 07:57:13 compute-0 sudo[156514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:13 compute-0 python3.9[156516]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:57:13 compute-0 sudo[156514]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:14 compute-0 sudo[156668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrpxljnyomxukfvccilyxpkrjfswtjlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391834.1138308-2582-213667283693032/AnsiballZ_command.py'
Oct 02 07:57:14 compute-0 sudo[156668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:14 compute-0 python3.9[156670]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:57:14 compute-0 sudo[156668]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:15 compute-0 sudo[156823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqyrgvqtdzabuzixzspbxisgbggemgst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391834.9327257-2598-12547399260717/AnsiballZ_file.py'
Oct 02 07:57:15 compute-0 sudo[156823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:15 compute-0 python3.9[156825]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:15 compute-0 sudo[156823]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:16 compute-0 sudo[156975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uruiywuavymrkqfuzwgxmzcuwdxausxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391835.6895401-2614-98280446655947/AnsiballZ_stat.py'
Oct 02 07:57:16 compute-0 sudo[156975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:16 compute-0 python3.9[156977]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:16 compute-0 sudo[156975]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:16 compute-0 sudo[157098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylyneftzofroqxwdtmszrhtmrhcxcofe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391835.6895401-2614-98280446655947/AnsiballZ_copy.py'
Oct 02 07:57:16 compute-0 sudo[157098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:16 compute-0 python3.9[157100]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391835.6895401-2614-98280446655947/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:16 compute-0 sudo[157098]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:17 compute-0 sudo[157250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltymdsfjjrwymxevvutsmzmfagzuvdkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391837.034302-2644-39938473485892/AnsiballZ_stat.py'
Oct 02 07:57:17 compute-0 sudo[157250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:17 compute-0 python3.9[157252]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:17 compute-0 sudo[157250]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:17 compute-0 sudo[157373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpqnolqowcerxmbaumwsjuokjaqrkbgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391837.034302-2644-39938473485892/AnsiballZ_copy.py'
Oct 02 07:57:17 compute-0 sudo[157373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:18 compute-0 python3.9[157375]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391837.034302-2644-39938473485892/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:18 compute-0 sudo[157373]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:18 compute-0 podman[157376]: 2025-10-02 07:57:18.167019607 +0000 UTC m=+0.078770158 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 07:57:18 compute-0 podman[157419]: 2025-10-02 07:57:18.283980941 +0000 UTC m=+0.089514295 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 02 07:57:18 compute-0 sudo[157570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgauycobrvebdffrblxgyhobelsoicju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391838.3237267-2674-91408780449150/AnsiballZ_stat.py'
Oct 02 07:57:18 compute-0 sudo[157570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:18 compute-0 python3.9[157572]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:18 compute-0 sudo[157570]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:19 compute-0 sudo[157693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkhzfwrienqtjllqvhqanyzndhggpkcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391838.3237267-2674-91408780449150/AnsiballZ_copy.py'
Oct 02 07:57:19 compute-0 sudo[157693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:19 compute-0 python3.9[157695]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391838.3237267-2674-91408780449150/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:19 compute-0 sudo[157693]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:19 compute-0 sudo[157845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uizfoaqjgnvuymrgmfclzfecobetwdpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391839.642469-2704-238220113063215/AnsiballZ_systemd.py'
Oct 02 07:57:19 compute-0 sudo[157845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:20 compute-0 python3.9[157847]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:57:20 compute-0 systemd[1]: Reloading.
Oct 02 07:57:20 compute-0 systemd-rc-local-generator[157872]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:57:20 compute-0 systemd-sysv-generator[157876]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:57:20 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Oct 02 07:57:20 compute-0 sudo[157845]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:21 compute-0 sudo[158036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mngulahelwyxcmztrgrvjsofngnttcbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391840.879028-2720-196526961109493/AnsiballZ_systemd.py'
Oct 02 07:57:21 compute-0 sudo[158036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:21 compute-0 python3.9[158038]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 02 07:57:21 compute-0 systemd[1]: Reloading.
Oct 02 07:57:21 compute-0 systemd-rc-local-generator[158065]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:57:21 compute-0 systemd-sysv-generator[158068]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:57:21 compute-0 systemd[1]: Reloading.
Oct 02 07:57:21 compute-0 systemd-sysv-generator[158106]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:57:21 compute-0 systemd-rc-local-generator[158102]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:57:22 compute-0 sudo[158036]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:22 compute-0 sshd-session[103822]: Connection closed by 192.168.122.30 port 52282
Oct 02 07:57:22 compute-0 sshd-session[103819]: pam_unix(sshd:session): session closed for user zuul
Oct 02 07:57:22 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Oct 02 07:57:22 compute-0 systemd[1]: session-24.scope: Consumed 3min 44.818s CPU time.
Oct 02 07:57:22 compute-0 systemd-logind[827]: Session 24 logged out. Waiting for processes to exit.
Oct 02 07:57:22 compute-0 systemd-logind[827]: Removed session 24.
Oct 02 07:57:28 compute-0 sshd-session[158133]: Accepted publickey for zuul from 192.168.122.30 port 43794 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 07:57:28 compute-0 systemd-logind[827]: New session 25 of user zuul.
Oct 02 07:57:28 compute-0 systemd[1]: Started Session 25 of User zuul.
Oct 02 07:57:28 compute-0 sshd-session[158133]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 07:57:29 compute-0 python3.9[158286]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:57:30 compute-0 sudo[158440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjwkhctntocoolwqydjdobvruvihomgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391849.897966-48-19954095418189/AnsiballZ_file.py'
Oct 02 07:57:30 compute-0 sudo[158440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:30 compute-0 python3.9[158442]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:57:30 compute-0 sudo[158440]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:31 compute-0 sudo[158592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhvaxliycrmyykcigxdcedfugocwoxzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391850.8470552-48-225366216802357/AnsiballZ_file.py'
Oct 02 07:57:31 compute-0 sudo[158592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:31 compute-0 python3.9[158594]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:57:31 compute-0 sudo[158592]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:31 compute-0 sudo[158744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhymljyfutomrzkwgqtjjfuopodztgzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391851.5371463-48-213354395021985/AnsiballZ_file.py'
Oct 02 07:57:31 compute-0 sudo[158744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:32 compute-0 python3.9[158746]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:57:32 compute-0 sudo[158744]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:32 compute-0 sudo[158896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbrwmdkomcvzetiwtkorjepdpqbhosyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391852.291661-48-200415397310366/AnsiballZ_file.py'
Oct 02 07:57:32 compute-0 sudo[158896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:32 compute-0 python3.9[158898]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 02 07:57:32 compute-0 sudo[158896]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:33 compute-0 sudo[159048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwwqymlwnoqwbdjncjbtfxjvcsiiqbod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391853.0683956-48-164597300278159/AnsiballZ_file.py'
Oct 02 07:57:33 compute-0 sudo[159048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:33 compute-0 python3.9[159050]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:57:33 compute-0 sudo[159048]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:34 compute-0 sudo[159200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovbcqpwijxuifhmawvxcolxdtbglnjah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391853.778936-120-151265190225897/AnsiballZ_stat.py'
Oct 02 07:57:34 compute-0 sudo[159200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:34 compute-0 python3.9[159202]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:57:34 compute-0 sudo[159200]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:35 compute-0 sudo[159354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sunmjpgmtdxamlejpgsfvmicmcjlyfok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391854.7571836-136-93828126060477/AnsiballZ_systemd.py'
Oct 02 07:57:35 compute-0 sudo[159354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:35 compute-0 python3.9[159356]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:57:35 compute-0 systemd[1]: Reloading.
Oct 02 07:57:36 compute-0 systemd-rc-local-generator[159388]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:57:36 compute-0 systemd-sysv-generator[159392]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:57:36 compute-0 sudo[159354]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:37 compute-0 sudo[159543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydhhgsizgxkkhrwstjwagxxerinytann ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391856.5029013-152-237758195362360/AnsiballZ_service_facts.py'
Oct 02 07:57:37 compute-0 sudo[159543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:37 compute-0 python3.9[159545]: ansible-ansible.builtin.service_facts Invoked
Oct 02 07:57:37 compute-0 network[159562]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 02 07:57:37 compute-0 network[159563]: 'network-scripts' will be removed from distribution in near future.
Oct 02 07:57:37 compute-0 network[159564]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 02 07:57:41 compute-0 sudo[159543]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:43 compute-0 sudo[159835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqjeonkqxfqfqifjqontflbtstulohng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391862.968791-168-133465712709264/AnsiballZ_systemd.py'
Oct 02 07:57:43 compute-0 sudo[159835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:43 compute-0 python3.9[159837]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:57:43 compute-0 systemd[1]: Reloading.
Oct 02 07:57:43 compute-0 systemd-rc-local-generator[159865]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:57:43 compute-0 systemd-sysv-generator[159869]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:57:44 compute-0 sudo[159835]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:44 compute-0 python3.9[160023]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:57:45 compute-0 sudo[160173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydssuzzmhantvlgyhwhyconarjfptyau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391865.1761417-202-108073453661101/AnsiballZ_podman_container.py'
Oct 02 07:57:45 compute-0 sudo[160173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:57:45.955 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 07:57:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:57:45.957 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 07:57:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:57:45.957 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 07:57:45 compute-0 python3.9[160175]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 02 07:57:46 compute-0 podman[160212]: 2025-10-02 07:57:46.215606772 +0000 UTC m=+0.059478997 container create 594ae8517fdc813cb46b0d3af9d996adeaecdb8303985317e128b4ff504cb7f6 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 07:57:46 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 07:57:46 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.2578] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/20)
Oct 02 07:57:46 compute-0 podman[160212]: 2025-10-02 07:57:46.187002245 +0000 UTC m=+0.030874530 image pull 81d94872551c3ae3c30801602bbb5f0c44872f15dcde472a0ba869fe2f28966e quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct 02 07:57:46 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct 02 07:57:46 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 02 07:57:46 compute-0 kernel: veth0: entered allmulticast mode
Oct 02 07:57:46 compute-0 kernel: veth0: entered promiscuous mode
Oct 02 07:57:46 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct 02 07:57:46 compute-0 kernel: podman0: port 1(veth0) entered forwarding state
Oct 02 07:57:46 compute-0 systemd-udevd[160231]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 07:57:46 compute-0 systemd-udevd[160233]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3112] device (veth0): carrier: link connected
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3126] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3139] device (podman0): carrier: link connected
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3187] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3199] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3208] device (podman0): Activation: starting connection 'podman0' (1658985f-d408-4d03-b4e6-057906f35491)
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3210] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3213] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3216] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3218] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 02 07:57:46 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 02 07:57:46 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3521] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3524] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.3533] device (podman0): Activation: successful, device activated.
Oct 02 07:57:46 compute-0 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct 02 07:57:46 compute-0 systemd[1]: Started libpod-conmon-594ae8517fdc813cb46b0d3af9d996adeaecdb8303985317e128b4ff504cb7f6.scope.
Oct 02 07:57:46 compute-0 systemd[1]: Started libcrun container.
Oct 02 07:57:46 compute-0 podman[160212]: 2025-10-02 07:57:46.650597232 +0000 UTC m=+0.494469477 container init 594ae8517fdc813cb46b0d3af9d996adeaecdb8303985317e128b4ff504cb7f6 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 07:57:46 compute-0 podman[160212]: 2025-10-02 07:57:46.660672949 +0000 UTC m=+0.504545154 container start 594ae8517fdc813cb46b0d3af9d996adeaecdb8303985317e128b4ff504cb7f6 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 07:57:46 compute-0 podman[160212]: 2025-10-02 07:57:46.664016334 +0000 UTC m=+0.507888559 container attach 594ae8517fdc813cb46b0d3af9d996adeaecdb8303985317e128b4ff504cb7f6 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct 02 07:57:46 compute-0 iscsid_config[160370]: iqn.1994-05.com.redhat:a713b87ac693
Oct 02 07:57:46 compute-0 systemd[1]: libpod-594ae8517fdc813cb46b0d3af9d996adeaecdb8303985317e128b4ff504cb7f6.scope: Deactivated successfully.
Oct 02 07:57:46 compute-0 podman[160212]: 2025-10-02 07:57:46.666109949 +0000 UTC m=+0.509982164 container died 594ae8517fdc813cb46b0d3af9d996adeaecdb8303985317e128b4ff504cb7f6 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 02 07:57:46 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 02 07:57:46 compute-0 kernel: veth0 (unregistering): left allmulticast mode
Oct 02 07:57:46 compute-0 kernel: veth0 (unregistering): left promiscuous mode
Oct 02 07:57:46 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 02 07:57:46 compute-0 NetworkManager[51654]: <info>  [1759391866.7459] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 07:57:47 compute-0 systemd[1]: run-netns-netns\x2d13644679\x2d5896\x2dca69\x2d8e65\x2d3b8d83a39cc3.mount: Deactivated successfully.
Oct 02 07:57:47 compute-0 podman[160212]: 2025-10-02 07:57:47.090622771 +0000 UTC m=+0.934495006 container remove 594ae8517fdc813cb46b0d3af9d996adeaecdb8303985317e128b4ff504cb7f6 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 02 07:57:47 compute-0 python3.9[160175]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid:current-podified /usr/sbin/iscsi-iname
Oct 02 07:57:47 compute-0 systemd[1]: libpod-conmon-594ae8517fdc813cb46b0d3af9d996adeaecdb8303985317e128b4ff504cb7f6.scope: Deactivated successfully.
Oct 02 07:57:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-e97002a92c48eea006f50e981c138de32518a4d902d28dc16084cccf56c9caae-merged.mount: Deactivated successfully.
Oct 02 07:57:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-594ae8517fdc813cb46b0d3af9d996adeaecdb8303985317e128b4ff504cb7f6-userdata-shm.mount: Deactivated successfully.
Oct 02 07:57:47 compute-0 python3.9[160175]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                             DEPRECATED command:
                                             It is recommended to use Quadlets for running containers and pods under systemd.
                                             
                                             Please refer to podman-systemd.unit(5) for details.
                                             Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct 02 07:57:47 compute-0 sudo[160173]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:47 compute-0 sudo[160610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kehnowdbpvklaribqfroaiijfbmetaas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391867.4678235-218-88181098080989/AnsiballZ_stat.py'
Oct 02 07:57:47 compute-0 sudo[160610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:47 compute-0 python3.9[160612]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:47 compute-0 sudo[160610]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:48 compute-0 sudo[160753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwcufxcidxobdhuboztzdihtfmrctpzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391867.4678235-218-88181098080989/AnsiballZ_copy.py'
Oct 02 07:57:48 compute-0 sudo[160753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:48 compute-0 podman[160707]: 2025-10-02 07:57:48.652170082 +0000 UTC m=+0.103780468 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 07:57:48 compute-0 podman[160708]: 2025-10-02 07:57:48.697102602 +0000 UTC m=+0.149147011 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 02 07:57:48 compute-0 python3.9[160764]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391867.4678235-218-88181098080989/.source.iscsi _original_basename=.agsrwz4s follow=False checksum=508b9dc02d5be80e9b467c3b3001d0435263627a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:48 compute-0 sudo[160753]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:49 compute-0 sudo[160926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spgdzjnzlnknctbxidcjsuxtdszsyrrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391869.097199-248-204950495306263/AnsiballZ_file.py'
Oct 02 07:57:49 compute-0 sudo[160926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:49 compute-0 python3.9[160928]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:49 compute-0 sudo[160926]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:50 compute-0 python3.9[161078]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:57:51 compute-0 sudo[161230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uweqvrupukjywtdkxfxwbxixqwphanvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391870.7674825-282-158844537087986/AnsiballZ_lineinfile.py'
Oct 02 07:57:51 compute-0 sudo[161230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:51 compute-0 python3.9[161232]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:51 compute-0 sudo[161230]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:52 compute-0 sudo[161382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjgkbpsracadqgbwkyfpmddculmqjnxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391871.8554578-300-77702539626533/AnsiballZ_file.py'
Oct 02 07:57:52 compute-0 sudo[161382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:52 compute-0 python3.9[161384]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:57:52 compute-0 sudo[161382]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:52 compute-0 sudo[161534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhkfthcykrjltujiazyrbvallgvwedli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391872.6548655-316-163672667830493/AnsiballZ_stat.py'
Oct 02 07:57:52 compute-0 sudo[161534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:53 compute-0 python3.9[161536]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:53 compute-0 sudo[161534]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:53 compute-0 sudo[161612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haumqeioceplitekvjkjhtksyehqaput ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391872.6548655-316-163672667830493/AnsiballZ_file.py'
Oct 02 07:57:53 compute-0 sudo[161612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:53 compute-0 python3.9[161614]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:57:53 compute-0 sudo[161612]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:54 compute-0 sudo[161764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wndxcqkqaxydrodtaaxnavbtypblaizr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391873.9039402-316-11900695654241/AnsiballZ_stat.py'
Oct 02 07:57:54 compute-0 sudo[161764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:54 compute-0 python3.9[161766]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:54 compute-0 sudo[161764]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:54 compute-0 sudo[161842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vomilxlnsbozsfonszzruwdsbmigbcxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391873.9039402-316-11900695654241/AnsiballZ_file.py'
Oct 02 07:57:54 compute-0 sudo[161842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:55 compute-0 python3.9[161844]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:57:55 compute-0 sudo[161842]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:55 compute-0 sudo[161994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjggpzjelkiqjgxbeokxfhmwhfoxwskl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391875.2920127-362-266279548111445/AnsiballZ_file.py'
Oct 02 07:57:55 compute-0 sudo[161994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:55 compute-0 python3.9[161996]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:55 compute-0 sudo[161994]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:56 compute-0 sudo[162146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gywkjsjkifyrekyrrujknkptfzsvmubc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391876.061613-378-9801534397427/AnsiballZ_stat.py'
Oct 02 07:57:56 compute-0 sudo[162146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:56 compute-0 python3.9[162148]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:56 compute-0 sudo[162146]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:56 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 02 07:57:57 compute-0 sudo[162225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccmbcjogzglodkoizlwzqapsvjujoatp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391876.061613-378-9801534397427/AnsiballZ_file.py'
Oct 02 07:57:57 compute-0 sudo[162225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:57 compute-0 python3.9[162227]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:57 compute-0 sudo[162225]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:57 compute-0 sudo[162377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duxjtqiasurpqabyyjkvvsoshvuewspq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391877.4594696-402-105325761430735/AnsiballZ_stat.py'
Oct 02 07:57:57 compute-0 sudo[162377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:58 compute-0 python3.9[162379]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:57:58 compute-0 sudo[162377]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:58 compute-0 sudo[162455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znipqryleendrijistmqaoonwlfkfwhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391877.4594696-402-105325761430735/AnsiballZ_file.py'
Oct 02 07:57:58 compute-0 sudo[162455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:58 compute-0 python3.9[162457]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:57:58 compute-0 sudo[162455]: pam_unix(sudo:session): session closed for user root
Oct 02 07:57:59 compute-0 sudo[162607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhbglxznbnlnnbhhgrrchvteubqzuocw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391879.0046923-426-130537465906541/AnsiballZ_systemd.py'
Oct 02 07:57:59 compute-0 sudo[162607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:57:59 compute-0 python3.9[162609]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:57:59 compute-0 systemd[1]: Reloading.
Oct 02 07:57:59 compute-0 systemd-sysv-generator[162638]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:57:59 compute-0 systemd-rc-local-generator[162630]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:58:00 compute-0 sudo[162607]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:00 compute-0 sudo[162797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpmjdrwinktbzvebsdcqxnnulffypucd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391880.3157244-442-27848918904837/AnsiballZ_stat.py'
Oct 02 07:58:00 compute-0 sudo[162797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:00 compute-0 python3.9[162799]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:01 compute-0 sudo[162797]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:01 compute-0 sudo[162875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpgsgashsqhocjuhfijnaqeylmlgdbuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391880.3157244-442-27848918904837/AnsiballZ_file.py'
Oct 02 07:58:01 compute-0 sudo[162875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:01 compute-0 python3.9[162877]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:01 compute-0 sudo[162875]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:02 compute-0 sudo[163027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfqxafiumhzngnktilvpmyzvaaoqeeiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391881.782947-466-267839973102913/AnsiballZ_stat.py'
Oct 02 07:58:02 compute-0 sudo[163027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:02 compute-0 python3.9[163029]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:02 compute-0 sudo[163027]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:02 compute-0 sudo[163105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmipopbbgtbcoaenfrpmgxgnjoezxqiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391881.782947-466-267839973102913/AnsiballZ_file.py'
Oct 02 07:58:02 compute-0 sudo[163105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:02 compute-0 python3.9[163107]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:02 compute-0 sudo[163105]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:03 compute-0 sudo[163257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egmljhipeohbdjjhullikylmxblqeefz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391883.198647-490-242671091549290/AnsiballZ_systemd.py'
Oct 02 07:58:03 compute-0 sudo[163257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:03 compute-0 python3.9[163259]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:58:03 compute-0 systemd[1]: Reloading.
Oct 02 07:58:03 compute-0 systemd-rc-local-generator[163289]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:58:03 compute-0 systemd-sysv-generator[163292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:58:04 compute-0 systemd[1]: Starting Create netns directory...
Oct 02 07:58:04 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 02 07:58:04 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 02 07:58:04 compute-0 systemd[1]: Finished Create netns directory.
Oct 02 07:58:04 compute-0 sudo[163257]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:04 compute-0 sudo[163451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnglqjifwrvceadybkbbvdgmdubcskve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391884.5794423-510-145532661197218/AnsiballZ_file.py'
Oct 02 07:58:04 compute-0 sudo[163451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:05 compute-0 python3.9[163453]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:58:05 compute-0 sudo[163451]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:05 compute-0 sudo[163603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmjhsrnnvrreusyaigwolamcfpndionx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391885.3896322-526-235103661342291/AnsiballZ_stat.py'
Oct 02 07:58:05 compute-0 sudo[163603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:05 compute-0 python3.9[163605]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:05 compute-0 sudo[163603]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:06 compute-0 sudo[163726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdaxlnmfxmwxbglizruasnttoeexhotu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391885.3896322-526-235103661342291/AnsiballZ_copy.py'
Oct 02 07:58:06 compute-0 sudo[163726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:06 compute-0 python3.9[163728]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391885.3896322-526-235103661342291/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:58:06 compute-0 sudo[163726]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:07 compute-0 sudo[163878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbqrvsjucqdsuljemdesjadgoxbdhjky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391887.1091573-560-40829390954394/AnsiballZ_file.py'
Oct 02 07:58:07 compute-0 sudo[163878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:07 compute-0 python3.9[163880]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:58:07 compute-0 sudo[163878]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:08 compute-0 sudo[164032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drsviekoyhefebeqnvchuctihiusloyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391887.9807055-576-74626039805511/AnsiballZ_stat.py'
Oct 02 07:58:08 compute-0 sudo[164032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:08 compute-0 python3.9[164034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:08 compute-0 sudo[164032]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:08 compute-0 unix_chkpwd[164081]: password check failed for user (root)
Oct 02 07:58:08 compute-0 sshd-session[163905]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 02 07:58:08 compute-0 sudo[164156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnkhverclphkkmzemurplnukakykghpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391887.9807055-576-74626039805511/AnsiballZ_copy.py'
Oct 02 07:58:08 compute-0 sudo[164156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:09 compute-0 python3.9[164158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391887.9807055-576-74626039805511/.source.json _original_basename=.0kvs5wbe follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:09 compute-0 sudo[164156]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:09 compute-0 sudo[164308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpmtmwhpmorjguvfzkeexfcwczrbdjok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391889.417946-606-36021428843442/AnsiballZ_file.py'
Oct 02 07:58:09 compute-0 sudo[164308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:10 compute-0 python3.9[164310]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:10 compute-0 sudo[164308]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:10 compute-0 sshd-session[163905]: Failed password for root from 193.46.255.7 port 50980 ssh2
Oct 02 07:58:10 compute-0 sudo[164460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgcssargqrsmdfjmefifgvfpftnvvvkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391890.393685-622-157674663094501/AnsiballZ_stat.py'
Oct 02 07:58:10 compute-0 sudo[164460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:10 compute-0 unix_chkpwd[164463]: password check failed for user (root)
Oct 02 07:58:10 compute-0 sudo[164460]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:11 compute-0 sudo[164584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmalsqbqtgzyhakboodfxzdrysqxmgyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391890.393685-622-157674663094501/AnsiballZ_copy.py'
Oct 02 07:58:11 compute-0 sudo[164584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:11 compute-0 sudo[164584]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:12 compute-0 sudo[164736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckilmqsthbqfnioooonmchaumlyebmhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391892.0457175-656-89335570583480/AnsiballZ_container_config_data.py'
Oct 02 07:58:12 compute-0 sudo[164736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:12 compute-0 python3.9[164738]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct 02 07:58:12 compute-0 sudo[164736]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:13 compute-0 sshd-session[163905]: Failed password for root from 193.46.255.7 port 50980 ssh2
Oct 02 07:58:13 compute-0 sudo[164888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psqmuvratpqioknwalyiwqstrziqhbmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391893.1424887-674-215553370454255/AnsiballZ_container_config_hash.py'
Oct 02 07:58:13 compute-0 sudo[164888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:13 compute-0 python3.9[164890]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 02 07:58:13 compute-0 sudo[164888]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:14 compute-0 sudo[165040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rthahsyttmtmfquclraaaigwzjigzbjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391894.2464495-692-129853370223400/AnsiballZ_podman_container_info.py'
Oct 02 07:58:14 compute-0 sudo[165040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:15 compute-0 unix_chkpwd[165043]: password check failed for user (root)
Oct 02 07:58:15 compute-0 python3.9[165042]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 02 07:58:15 compute-0 sudo[165040]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:16 compute-0 sudo[165219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmbqfmosxfvumsznqkefluibuyrnkulo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759391896.082114-718-61110361146806/AnsiballZ_edpm_container_manage.py'
Oct 02 07:58:16 compute-0 sudo[165219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:16 compute-0 python3[165221]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 02 07:58:17 compute-0 sshd-session[163905]: Failed password for root from 193.46.255.7 port 50980 ssh2
Oct 02 07:58:17 compute-0 podman[165259]: 2025-10-02 07:58:17.165130381 +0000 UTC m=+0.078129152 container create c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 02 07:58:17 compute-0 podman[165259]: 2025-10-02 07:58:17.120238423 +0000 UTC m=+0.033237274 image pull 81d94872551c3ae3c30801602bbb5f0c44872f15dcde472a0ba869fe2f28966e quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct 02 07:58:17 compute-0 python3[165221]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct 02 07:58:17 compute-0 sudo[165219]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:17 compute-0 sudo[165447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meguipemqwzabbuttwrugrjbnvvektok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391897.6181526-734-198804945121066/AnsiballZ_stat.py'
Oct 02 07:58:17 compute-0 sudo[165447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:18 compute-0 python3.9[165449]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:58:18 compute-0 sudo[165447]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:19 compute-0 sudo[165624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oruopiifzghonptzlnnqqflcivspbchh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391898.6539924-752-30035232287734/AnsiballZ_file.py'
Oct 02 07:58:19 compute-0 sudo[165624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:19 compute-0 podman[165575]: 2025-10-02 07:58:19.015011589 +0000 UTC m=+0.074529079 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 02 07:58:19 compute-0 podman[165576]: 2025-10-02 07:58:19.053018212 +0000 UTC m=+0.107589067 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 07:58:19 compute-0 sshd-session[163905]: Received disconnect from 193.46.255.7 port 50980:11:  [preauth]
Oct 02 07:58:19 compute-0 sshd-session[163905]: Disconnected from authenticating user root 193.46.255.7 port 50980 [preauth]
Oct 02 07:58:19 compute-0 sshd-session[163905]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 02 07:58:19 compute-0 python3.9[165640]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:19 compute-0 sudo[165624]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:19 compute-0 sudo[165723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvgzhwemiseeeobvwwkyphihegrurqcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391898.6539924-752-30035232287734/AnsiballZ_stat.py'
Oct 02 07:58:19 compute-0 sudo[165723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:19 compute-0 python3.9[165725]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:58:19 compute-0 sudo[165723]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:20 compute-0 unix_chkpwd[165832]: password check failed for user (root)
Oct 02 07:58:20 compute-0 sshd-session[165671]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 02 07:58:20 compute-0 sudo[165875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibfqpiszdmqaodiyidcauppkzlghosla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391899.6704998-752-209496906176694/AnsiballZ_copy.py'
Oct 02 07:58:20 compute-0 sudo[165875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:20 compute-0 python3.9[165877]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759391899.6704998-752-209496906176694/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:20 compute-0 sudo[165875]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:20 compute-0 sudo[165951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swqxlouaicklcxkdbkwsuhendtlzqnul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391899.6704998-752-209496906176694/AnsiballZ_systemd.py'
Oct 02 07:58:20 compute-0 sudo[165951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:20 compute-0 python3.9[165953]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 07:58:20 compute-0 systemd[1]: Reloading.
Oct 02 07:58:21 compute-0 systemd-rc-local-generator[165982]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:58:21 compute-0 systemd-sysv-generator[165985]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:58:21 compute-0 sudo[165951]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:21 compute-0 sudo[166063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkixfrstzpekmskjtxcomoozloxvrywv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391899.6704998-752-209496906176694/AnsiballZ_systemd.py'
Oct 02 07:58:21 compute-0 sudo[166063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:22 compute-0 python3.9[166065]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:58:22 compute-0 systemd[1]: Reloading.
Oct 02 07:58:22 compute-0 systemd-sysv-generator[166101]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:58:22 compute-0 systemd-rc-local-generator[166098]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:58:22 compute-0 systemd[1]: Starting iscsid container...
Oct 02 07:58:22 compute-0 sshd-session[165671]: Failed password for root from 193.46.255.7 port 24390 ssh2
Oct 02 07:58:22 compute-0 systemd[1]: Started libcrun container.
Oct 02 07:58:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11b64b5c5398aa47d447a37d2dc52d04c6651c3d5aedad56a64a6344f39874c2/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct 02 07:58:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11b64b5c5398aa47d447a37d2dc52d04c6651c3d5aedad56a64a6344f39874c2/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 02 07:58:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11b64b5c5398aa47d447a37d2dc52d04c6651c3d5aedad56a64a6344f39874c2/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 02 07:58:22 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0.
Oct 02 07:58:22 compute-0 podman[166106]: 2025-10-02 07:58:22.612713646 +0000 UTC m=+0.164091101 container init c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 07:58:22 compute-0 iscsid[166121]: + sudo -E kolla_set_configs
Oct 02 07:58:22 compute-0 podman[166106]: 2025-10-02 07:58:22.645556186 +0000 UTC m=+0.196933621 container start c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3)
Oct 02 07:58:22 compute-0 podman[166106]: iscsid
Oct 02 07:58:22 compute-0 sudo[166127]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 02 07:58:22 compute-0 systemd[1]: Started iscsid container.
Oct 02 07:58:22 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct 02 07:58:22 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 02 07:58:22 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 02 07:58:22 compute-0 sudo[166063]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:22 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct 02 07:58:22 compute-0 systemd[166140]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 02 07:58:22 compute-0 podman[166128]: 2025-10-02 07:58:22.731969678 +0000 UTC m=+0.070698710 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 02 07:58:22 compute-0 systemd[1]: c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0-226c09828ef65f02.service: Main process exited, code=exited, status=1/FAILURE
Oct 02 07:58:22 compute-0 systemd[1]: c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0-226c09828ef65f02.service: Failed with result 'exit-code'.
Oct 02 07:58:22 compute-0 systemd[166140]: Queued start job for default target Main User Target.
Oct 02 07:58:22 compute-0 systemd[166140]: Created slice User Application Slice.
Oct 02 07:58:22 compute-0 systemd[166140]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 02 07:58:22 compute-0 systemd[166140]: Started Daily Cleanup of User's Temporary Directories.
Oct 02 07:58:22 compute-0 systemd[166140]: Reached target Paths.
Oct 02 07:58:22 compute-0 systemd[166140]: Reached target Timers.
Oct 02 07:58:22 compute-0 systemd[166140]: Starting D-Bus User Message Bus Socket...
Oct 02 07:58:22 compute-0 systemd[166140]: Starting Create User's Volatile Files and Directories...
Oct 02 07:58:22 compute-0 systemd[166140]: Listening on D-Bus User Message Bus Socket.
Oct 02 07:58:22 compute-0 systemd[166140]: Reached target Sockets.
Oct 02 07:58:22 compute-0 systemd[166140]: Finished Create User's Volatile Files and Directories.
Oct 02 07:58:22 compute-0 systemd[166140]: Reached target Basic System.
Oct 02 07:58:22 compute-0 systemd[166140]: Reached target Main User Target.
Oct 02 07:58:22 compute-0 systemd[166140]: Startup finished in 137ms.
Oct 02 07:58:22 compute-0 systemd[1]: Started User Manager for UID 0.
Oct 02 07:58:22 compute-0 systemd[1]: Started Session c3 of User root.
Oct 02 07:58:22 compute-0 sudo[166127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 02 07:58:22 compute-0 iscsid[166121]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 02 07:58:22 compute-0 iscsid[166121]: INFO:__main__:Validating config file
Oct 02 07:58:22 compute-0 iscsid[166121]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 02 07:58:22 compute-0 iscsid[166121]: INFO:__main__:Writing out command to execute
Oct 02 07:58:22 compute-0 sudo[166127]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:22 compute-0 systemd[1]: session-c3.scope: Deactivated successfully.
Oct 02 07:58:22 compute-0 iscsid[166121]: ++ cat /run_command
Oct 02 07:58:22 compute-0 iscsid[166121]: + CMD='/usr/sbin/iscsid -f'
Oct 02 07:58:22 compute-0 iscsid[166121]: + ARGS=
Oct 02 07:58:22 compute-0 iscsid[166121]: + sudo kolla_copy_cacerts
Oct 02 07:58:22 compute-0 sudo[166218]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 02 07:58:23 compute-0 systemd[1]: Started Session c4 of User root.
Oct 02 07:58:23 compute-0 sudo[166218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 02 07:58:23 compute-0 sudo[166218]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:23 compute-0 systemd[1]: session-c4.scope: Deactivated successfully.
Oct 02 07:58:23 compute-0 iscsid[166121]: + [[ ! -n '' ]]
Oct 02 07:58:23 compute-0 iscsid[166121]: + . kolla_extend_start
Oct 02 07:58:23 compute-0 iscsid[166121]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct 02 07:58:23 compute-0 iscsid[166121]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct 02 07:58:23 compute-0 iscsid[166121]: Running command: '/usr/sbin/iscsid -f'
Oct 02 07:58:23 compute-0 iscsid[166121]: + umask 0022
Oct 02 07:58:23 compute-0 iscsid[166121]: + exec /usr/sbin/iscsid -f
Oct 02 07:58:23 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Oct 02 07:58:23 compute-0 python3.9[166324]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:58:24 compute-0 sudo[166474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chgtrvjhjwnuesmhoaeiwaxlzycykrtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391903.7241185-826-122727101915152/AnsiballZ_file.py'
Oct 02 07:58:24 compute-0 sudo[166474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:24 compute-0 python3.9[166476]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:24 compute-0 sudo[166474]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:24 compute-0 unix_chkpwd[166477]: password check failed for user (root)
Oct 02 07:58:25 compute-0 sudo[166627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmlambicyxzwszvggdqdvmhsszolvctf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391904.7307389-848-3252588671988/AnsiballZ_service_facts.py'
Oct 02 07:58:25 compute-0 sudo[166627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:25 compute-0 python3.9[166629]: ansible-ansible.builtin.service_facts Invoked
Oct 02 07:58:25 compute-0 network[166646]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 02 07:58:25 compute-0 network[166647]: 'network-scripts' will be removed from distribution in near future.
Oct 02 07:58:25 compute-0 network[166648]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 02 07:58:25 compute-0 sshd-session[165671]: Failed password for root from 193.46.255.7 port 24390 ssh2
Oct 02 07:58:26 compute-0 unix_chkpwd[166668]: password check failed for user (root)
Oct 02 07:58:28 compute-0 sshd-session[166659]: Invalid user supervisor from 128.185.215.38 port 57024
Oct 02 07:58:28 compute-0 sshd-session[166659]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 07:58:28 compute-0 sshd-session[166659]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=128.185.215.38
Oct 02 07:58:28 compute-0 sshd-session[165671]: Failed password for root from 193.46.255.7 port 24390 ssh2
Oct 02 07:58:28 compute-0 sshd-session[165671]: Received disconnect from 193.46.255.7 port 24390:11:  [preauth]
Oct 02 07:58:28 compute-0 sshd-session[165671]: Disconnected from authenticating user root 193.46.255.7 port 24390 [preauth]
Oct 02 07:58:28 compute-0 sshd-session[165671]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 02 07:58:29 compute-0 sudo[166627]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:29 compute-0 unix_chkpwd[166800]: password check failed for user (root)
Oct 02 07:58:29 compute-0 sshd-session[166756]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 02 07:58:29 compute-0 sudo[166926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxjghmxfesnwqkaqgelhimrxqsftifyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391909.549875-868-169665727598952/AnsiballZ_file.py'
Oct 02 07:58:29 compute-0 sudo[166926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:30 compute-0 sshd-session[166659]: Failed password for invalid user supervisor from 128.185.215.38 port 57024 ssh2
Oct 02 07:58:30 compute-0 python3.9[166928]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 02 07:58:30 compute-0 sudo[166926]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:30 compute-0 sshd-session[166659]: Connection closed by invalid user supervisor 128.185.215.38 port 57024 [preauth]
Oct 02 07:58:31 compute-0 sudo[167078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-copkivtymyiswvtcdertcsjcqmzrmhkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391910.4938915-884-167422835231057/AnsiballZ_modprobe.py'
Oct 02 07:58:31 compute-0 sudo[167078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:31 compute-0 sshd-session[166756]: Failed password for root from 193.46.255.7 port 37010 ssh2
Oct 02 07:58:31 compute-0 python3.9[167080]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct 02 07:58:31 compute-0 sudo[167078]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:31 compute-0 unix_chkpwd[167132]: password check failed for user (root)
Oct 02 07:58:31 compute-0 sudo[167235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-japxwskdmjnirpnmhxclipazbdtrssti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391911.5565026-900-77598690157211/AnsiballZ_stat.py'
Oct 02 07:58:31 compute-0 sudo[167235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:32 compute-0 python3.9[167237]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:32 compute-0 sudo[167235]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:32 compute-0 sudo[167358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryjqtpepdwenxgysgtktvfegyiiovcjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391911.5565026-900-77598690157211/AnsiballZ_copy.py'
Oct 02 07:58:32 compute-0 sudo[167358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:32 compute-0 python3.9[167360]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391911.5565026-900-77598690157211/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:32 compute-0 sudo[167358]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:33 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct 02 07:58:33 compute-0 systemd[166140]: Activating special unit Exit the Session...
Oct 02 07:58:33 compute-0 systemd[166140]: Stopped target Main User Target.
Oct 02 07:58:33 compute-0 systemd[166140]: Stopped target Basic System.
Oct 02 07:58:33 compute-0 systemd[166140]: Stopped target Paths.
Oct 02 07:58:33 compute-0 systemd[166140]: Stopped target Sockets.
Oct 02 07:58:33 compute-0 systemd[166140]: Stopped target Timers.
Oct 02 07:58:33 compute-0 systemd[166140]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 02 07:58:33 compute-0 systemd[166140]: Closed D-Bus User Message Bus Socket.
Oct 02 07:58:33 compute-0 systemd[166140]: Stopped Create User's Volatile Files and Directories.
Oct 02 07:58:33 compute-0 systemd[166140]: Removed slice User Application Slice.
Oct 02 07:58:33 compute-0 systemd[166140]: Reached target Shutdown.
Oct 02 07:58:33 compute-0 systemd[166140]: Finished Exit the Session.
Oct 02 07:58:33 compute-0 systemd[166140]: Reached target Exit the Session.
Oct 02 07:58:33 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct 02 07:58:33 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct 02 07:58:33 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 02 07:58:33 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 02 07:58:33 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 02 07:58:33 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 02 07:58:33 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct 02 07:58:33 compute-0 sudo[167511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyqkvbwvwtgdlxtdsfvmakrkbnyreuwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391913.0821996-932-88762171764444/AnsiballZ_lineinfile.py'
Oct 02 07:58:33 compute-0 sudo[167511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:33 compute-0 sshd-session[166756]: Failed password for root from 193.46.255.7 port 37010 ssh2
Oct 02 07:58:33 compute-0 python3.9[167513]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:33 compute-0 sudo[167511]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:33 compute-0 unix_chkpwd[167538]: password check failed for user (root)
Oct 02 07:58:34 compute-0 sudo[167664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfuytqmrzlgrzegilqbimaxtdmogynld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391913.8091166-948-248886254394895/AnsiballZ_systemd.py'
Oct 02 07:58:34 compute-0 sudo[167664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:34 compute-0 python3.9[167666]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:58:34 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 02 07:58:34 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct 02 07:58:34 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct 02 07:58:34 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 02 07:58:34 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 02 07:58:34 compute-0 sudo[167664]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:35 compute-0 sudo[167820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvgqgcawexniwgyvgucmeurpuzmffwid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391914.748904-964-17320144040755/AnsiballZ_file.py'
Oct 02 07:58:35 compute-0 sudo[167820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:35 compute-0 python3.9[167822]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:58:35 compute-0 sudo[167820]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:35 compute-0 sshd-session[166756]: Failed password for root from 193.46.255.7 port 37010 ssh2
Oct 02 07:58:35 compute-0 sshd-session[166756]: Received disconnect from 193.46.255.7 port 37010:11:  [preauth]
Oct 02 07:58:35 compute-0 sshd-session[166756]: Disconnected from authenticating user root 193.46.255.7 port 37010 [preauth]
Oct 02 07:58:35 compute-0 sshd-session[166756]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.7  user=root
Oct 02 07:58:35 compute-0 sudo[167972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnvwzrrtoibzpmvfofiwsynelwflkrqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391915.5765681-982-219574104415027/AnsiballZ_stat.py'
Oct 02 07:58:35 compute-0 sudo[167972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:36 compute-0 python3.9[167974]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:58:36 compute-0 sudo[167972]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:36 compute-0 sudo[168124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwgbpjakckkmkjtgporvcgqwsnbqddwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391916.4899855-1000-2452926882537/AnsiballZ_stat.py'
Oct 02 07:58:36 compute-0 sudo[168124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:37 compute-0 python3.9[168126]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:58:37 compute-0 sudo[168124]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:37 compute-0 sudo[168276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzqfnlfnvozrapxrxawbwygtglslbsrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391917.3479958-1016-215360420315675/AnsiballZ_stat.py'
Oct 02 07:58:37 compute-0 sudo[168276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:37 compute-0 python3.9[168278]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:37 compute-0 sudo[168276]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:38 compute-0 sudo[168399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iekdmmzgiuyzuocttxczlhtnchcxbdyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391917.3479958-1016-215360420315675/AnsiballZ_copy.py'
Oct 02 07:58:38 compute-0 sudo[168399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:38 compute-0 python3.9[168401]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391917.3479958-1016-215360420315675/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:38 compute-0 sudo[168399]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:39 compute-0 sudo[168551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxmgbnttxesvbifrkjbkbwsccjxmaxov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391918.8755648-1046-236365380771392/AnsiballZ_command.py'
Oct 02 07:58:39 compute-0 sudo[168551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:39 compute-0 python3.9[168553]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:58:39 compute-0 sudo[168551]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:40 compute-0 sudo[168704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cupvqyzhdlvyyeztsntwlhhgkswevmsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391919.876996-1062-153239524504933/AnsiballZ_lineinfile.py'
Oct 02 07:58:40 compute-0 sudo[168704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:40 compute-0 python3.9[168706]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:40 compute-0 sudo[168704]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:41 compute-0 sudo[168856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjqblhqcdhawyqmnjiglyxznobznijto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391920.7101407-1078-11319301864126/AnsiballZ_replace.py'
Oct 02 07:58:41 compute-0 sudo[168856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:41 compute-0 python3.9[168858]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:41 compute-0 sudo[168856]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:42 compute-0 sudo[169008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaqrbiraelqcnalfkgjefwplejhezlyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391921.7657762-1094-245077759927980/AnsiballZ_replace.py'
Oct 02 07:58:42 compute-0 sudo[169008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:42 compute-0 python3.9[169010]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:42 compute-0 sudo[169008]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:42 compute-0 sudo[169160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jygdatylyegofjzumaunvxegpgahjrdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391922.641463-1112-26618165479272/AnsiballZ_lineinfile.py'
Oct 02 07:58:42 compute-0 sudo[169160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:43 compute-0 python3.9[169162]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:43 compute-0 sudo[169160]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:43 compute-0 sudo[169312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwdtanybhsvrfzkscyjnujmjkoafadeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391923.3632643-1112-112667266818836/AnsiballZ_lineinfile.py'
Oct 02 07:58:43 compute-0 sudo[169312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:43 compute-0 python3.9[169314]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:43 compute-0 sudo[169312]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:44 compute-0 sudo[169464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vedggrcyiozhtuekjzhrzxbbbilmscaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391924.2009406-1112-121750083954356/AnsiballZ_lineinfile.py'
Oct 02 07:58:44 compute-0 sudo[169464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:44 compute-0 python3.9[169466]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:44 compute-0 sudo[169464]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:45 compute-0 sudo[169616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmorceatpkqmluvzfertlyinneaesfdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391924.9502738-1112-140218628731330/AnsiballZ_lineinfile.py'
Oct 02 07:58:45 compute-0 sudo[169616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:45 compute-0 python3.9[169618]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:45 compute-0 sudo[169616]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:58:45.957 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 07:58:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:58:45.958 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 07:58:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:58:45.958 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 07:58:46 compute-0 sudo[169768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kysvwymcpinqjfcnjwurxovdlnvpmbrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391925.7407994-1170-145389625176622/AnsiballZ_stat.py'
Oct 02 07:58:46 compute-0 sudo[169768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:46 compute-0 python3.9[169770]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:58:46 compute-0 sudo[169768]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:46 compute-0 sudo[169922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hopxvdhqcshblsvmgeloxsqoryewjoqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391926.5367794-1186-151213823522075/AnsiballZ_file.py'
Oct 02 07:58:46 compute-0 sudo[169922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:47 compute-0 python3.9[169924]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:47 compute-0 sudo[169922]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:47 compute-0 sudo[170074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnsfrywarxzxwsaohlqszcalkeuvvllz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391927.4561965-1204-1909053858738/AnsiballZ_file.py'
Oct 02 07:58:47 compute-0 sudo[170074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:47 compute-0 python3.9[170076]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:58:47 compute-0 sudo[170074]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:48 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct 02 07:58:48 compute-0 sudo[170227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oovpkchaggsxeroskkfuwmpwjsllalai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391928.2280056-1220-121343742124107/AnsiballZ_stat.py'
Oct 02 07:58:48 compute-0 sudo[170227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:48 compute-0 python3.9[170229]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:48 compute-0 sudo[170227]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:49 compute-0 sudo[170334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkdgvrqyqntamuxjdcsvuagwblsegftx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391928.2280056-1220-121343742124107/AnsiballZ_file.py'
Oct 02 07:58:49 compute-0 podman[170276]: 2025-10-02 07:58:49.163302138 +0000 UTC m=+0.070256671 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 07:58:49 compute-0 sudo[170334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:49 compute-0 podman[170280]: 2025-10-02 07:58:49.195104408 +0000 UTC m=+0.096831807 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct 02 07:58:49 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 07:58:49 compute-0 python3.9[170341]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:58:49 compute-0 sudo[170334]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:49 compute-0 sudo[170502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtgiddnaaricfzvuxpfjlxznnetmbezk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391929.5367913-1220-10784638156045/AnsiballZ_stat.py'
Oct 02 07:58:49 compute-0 sudo[170502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:50 compute-0 python3.9[170504]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:50 compute-0 sudo[170502]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:50 compute-0 sudo[170580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fheedmwriswvomnjkcruuyngpfufoxsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391929.5367913-1220-10784638156045/AnsiballZ_file.py'
Oct 02 07:58:50 compute-0 sudo[170580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:50 compute-0 python3.9[170582]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:58:50 compute-0 sudo[170580]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:50 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Oct 02 07:58:51 compute-0 sudo[170733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gebbaeiqatcllijzvdhoafhtfqejbhpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391930.8467994-1266-240968775916326/AnsiballZ_file.py'
Oct 02 07:58:51 compute-0 sudo[170733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:51 compute-0 python3.9[170735]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:51 compute-0 sudo[170733]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:52 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 02 07:58:52 compute-0 sudo[170886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsibryrglwiknbhjdxmfdsdhwcudzwwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391931.6605477-1282-268618015419883/AnsiballZ_stat.py'
Oct 02 07:58:52 compute-0 sudo[170886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:52 compute-0 python3.9[170888]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:52 compute-0 sudo[170886]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:52 compute-0 sudo[170964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogqnfhuwoygatpzwchdubvnonatcynig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391931.6605477-1282-268618015419883/AnsiballZ_file.py'
Oct 02 07:58:52 compute-0 sudo[170964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:52 compute-0 python3.9[170966]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:52 compute-0 sudo[170964]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:53 compute-0 podman[171021]: 2025-10-02 07:58:53.218268885 +0000 UTC m=+0.121996870 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 02 07:58:53 compute-0 sudo[171136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmwhmnegfkoyztvjatikmzgfkkedzqqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391933.0285916-1306-165547679519431/AnsiballZ_stat.py'
Oct 02 07:58:53 compute-0 sudo[171136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:53 compute-0 python3.9[171138]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:53 compute-0 sudo[171136]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:54 compute-0 sudo[171214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnpjhubgtwypwyrykjzhwcpoogxrqmaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391933.0285916-1306-165547679519431/AnsiballZ_file.py'
Oct 02 07:58:54 compute-0 sudo[171214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:54 compute-0 python3.9[171216]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:54 compute-0 sudo[171214]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:54 compute-0 sudo[171366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nufjwlbmkirlulvhqiztsdzewicdrzkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391934.3743382-1330-6552075937642/AnsiballZ_systemd.py'
Oct 02 07:58:54 compute-0 sudo[171366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:54 compute-0 python3.9[171368]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:58:54 compute-0 systemd[1]: Reloading.
Oct 02 07:58:55 compute-0 systemd-rc-local-generator[171392]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:58:55 compute-0 systemd-sysv-generator[171398]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:58:55 compute-0 sudo[171366]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:55 compute-0 sudo[171556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbetvkfqaqwjfwhzpecporzlevyonpoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391935.6006804-1346-129465047331070/AnsiballZ_stat.py'
Oct 02 07:58:55 compute-0 sudo[171556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:56 compute-0 python3.9[171558]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:56 compute-0 sudo[171556]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:56 compute-0 sudo[171634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aailsoxvpldzvmvcunxoqihdxxjbzlfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391935.6006804-1346-129465047331070/AnsiballZ_file.py'
Oct 02 07:58:56 compute-0 sudo[171634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:56 compute-0 python3.9[171636]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:56 compute-0 sudo[171634]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:57 compute-0 sudo[171786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qilguqeteoqdbahtwndgvvbxnwvcxeca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391937.0244503-1370-152420861580542/AnsiballZ_stat.py'
Oct 02 07:58:57 compute-0 sudo[171786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:57 compute-0 python3.9[171788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:58:57 compute-0 sudo[171786]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:58 compute-0 sudo[171864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yassnjuedfdotwdpptsccbwilluazftf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391937.0244503-1370-152420861580542/AnsiballZ_file.py'
Oct 02 07:58:58 compute-0 sudo[171864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:58 compute-0 python3.9[171866]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:58:58 compute-0 sudo[171864]: pam_unix(sudo:session): session closed for user root
Oct 02 07:58:58 compute-0 sudo[172016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqbsduxujfgboqusotulcuwwcjbfbjwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391938.4325168-1394-154044082942205/AnsiballZ_systemd.py'
Oct 02 07:58:58 compute-0 sudo[172016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:58:59 compute-0 python3.9[172018]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:58:59 compute-0 systemd[1]: Reloading.
Oct 02 07:58:59 compute-0 systemd-sysv-generator[172044]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:58:59 compute-0 systemd-rc-local-generator[172041]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:58:59 compute-0 systemd[1]: Starting Create netns directory...
Oct 02 07:58:59 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 02 07:58:59 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 02 07:58:59 compute-0 systemd[1]: Finished Create netns directory.
Oct 02 07:58:59 compute-0 sudo[172016]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:00 compute-0 sudo[172208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcyzrrkhjyumftczcpxjszwjjuizkvpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391940.0744803-1414-259958150847205/AnsiballZ_file.py'
Oct 02 07:59:00 compute-0 sudo[172208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:00 compute-0 python3.9[172210]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:59:00 compute-0 sudo[172208]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:01 compute-0 sudo[172360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thpvritvywgwqmrdtginfsgabsprkrlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391940.9989624-1430-251819434297943/AnsiballZ_stat.py'
Oct 02 07:59:01 compute-0 sudo[172360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:01 compute-0 python3.9[172362]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:59:01 compute-0 sudo[172360]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:01 compute-0 sudo[172483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srhhwtmhokfwzucjeakkiflhcxhjaimu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391940.9989624-1430-251819434297943/AnsiballZ_copy.py'
Oct 02 07:59:01 compute-0 sudo[172483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:02 compute-0 python3.9[172485]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759391940.9989624-1430-251819434297943/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:59:02 compute-0 sudo[172483]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:03 compute-0 sudo[172635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfpeargqjdpfpfdumjfdjhlkjmwlgpgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391942.6643949-1464-69704278004510/AnsiballZ_file.py'
Oct 02 07:59:03 compute-0 sudo[172635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:03 compute-0 python3.9[172637]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 07:59:03 compute-0 sudo[172635]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:03 compute-0 sudo[172787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjzlscldtozgjyhpjuzbjgqfmrmgmnmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391943.5987415-1480-168993550345591/AnsiballZ_stat.py'
Oct 02 07:59:03 compute-0 sudo[172787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:04 compute-0 python3.9[172789]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:59:04 compute-0 sudo[172787]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:04 compute-0 sudo[172910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppmnrytiksomqurmizugdrxhusnjthlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391943.5987415-1480-168993550345591/AnsiballZ_copy.py'
Oct 02 07:59:04 compute-0 sudo[172910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:04 compute-0 python3.9[172912]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391943.5987415-1480-168993550345591/.source.json _original_basename=.wybpgt63 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:04 compute-0 sudo[172910]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:05 compute-0 sudo[173062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gckphgomybgkzezmnnitwxaxfauugbhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391944.973937-1510-261565013973230/AnsiballZ_file.py'
Oct 02 07:59:05 compute-0 sudo[173062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:05 compute-0 python3.9[173064]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:05 compute-0 sudo[173062]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:06 compute-0 sudo[173214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnfkkbmtrtosjxsrtxtdnheklsydtvlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391945.849916-1526-86525691272548/AnsiballZ_stat.py'
Oct 02 07:59:06 compute-0 sudo[173214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:06 compute-0 sudo[173214]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:06 compute-0 sudo[173337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsgnzcxluxoffifbkrelkqsbmjfwhtjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391945.849916-1526-86525691272548/AnsiballZ_copy.py'
Oct 02 07:59:06 compute-0 sudo[173337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:07 compute-0 sudo[173337]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:08 compute-0 sudo[173489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqexfthgzvizetljwfetwqrygxnatvum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391947.624183-1560-76032622086493/AnsiballZ_container_config_data.py'
Oct 02 07:59:08 compute-0 sudo[173489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:08 compute-0 python3.9[173491]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct 02 07:59:08 compute-0 sudo[173489]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:08 compute-0 sudo[173641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsiewxnncehnimluhqtolfwolqaeblju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391948.5948741-1578-108747593540481/AnsiballZ_container_config_hash.py'
Oct 02 07:59:08 compute-0 sudo[173641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:09 compute-0 python3.9[173643]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 02 07:59:09 compute-0 sudo[173641]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:09 compute-0 sudo[173793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aryuntnpwwegdwrfyhcblffjrpfixbdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391949.485958-1596-188604296650858/AnsiballZ_podman_container_info.py'
Oct 02 07:59:09 compute-0 sudo[173793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:10 compute-0 python3.9[173795]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 02 07:59:10 compute-0 sudo[173793]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:11 compute-0 sudo[173971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsctpmletvcezidstuwhceejwdvqdkjd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759391951.0307908-1622-39635169171459/AnsiballZ_edpm_container_manage.py'
Oct 02 07:59:11 compute-0 sudo[173971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:11 compute-0 python3[173973]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 02 07:59:11 compute-0 podman[174008]: 2025-10-02 07:59:11.946400318 +0000 UTC m=+0.072496301 container create bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 07:59:11 compute-0 podman[174008]: 2025-10-02 07:59:11.903936963 +0000 UTC m=+0.030032986 image pull 4ee39d2b05f9d7d8e7f025baefe799c33619f4419f4eb27d17ca383a40343475 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct 02 07:59:11 compute-0 python3[173973]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct 02 07:59:12 compute-0 sudo[173971]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:12 compute-0 sudo[174196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srkgsgbtbptjrpyrhhtbpmcbtbmgaswp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391952.3937438-1638-15934188016183/AnsiballZ_stat.py'
Oct 02 07:59:12 compute-0 sudo[174196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:12 compute-0 python3.9[174198]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:59:13 compute-0 sudo[174196]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:13 compute-0 sudo[174350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzsvwvbpqwuvbkmhuircwgyhcebkrulf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391953.3657632-1656-154231520195496/AnsiballZ_file.py'
Oct 02 07:59:13 compute-0 sudo[174350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:13 compute-0 python3.9[174352]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:13 compute-0 sudo[174350]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:14 compute-0 sudo[174426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkzvzxfimlyefeihceyxrdisguaoirhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391953.3657632-1656-154231520195496/AnsiballZ_stat.py'
Oct 02 07:59:14 compute-0 sudo[174426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:14 compute-0 python3.9[174428]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:59:14 compute-0 sudo[174426]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:15 compute-0 sudo[174577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmjdyvqpbqvkcdcfsaeccyqdcpmdncen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391954.5043514-1656-189483066189343/AnsiballZ_copy.py'
Oct 02 07:59:15 compute-0 sudo[174577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:15 compute-0 python3.9[174579]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759391954.5043514-1656-189483066189343/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:15 compute-0 sudo[174577]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:15 compute-0 sudo[174653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csxyzgmckjonbgaypdfcfldztjnwirhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391954.5043514-1656-189483066189343/AnsiballZ_systemd.py'
Oct 02 07:59:15 compute-0 sudo[174653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:15 compute-0 python3.9[174655]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 07:59:15 compute-0 systemd[1]: Reloading.
Oct 02 07:59:16 compute-0 systemd-sysv-generator[174686]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:59:16 compute-0 systemd-rc-local-generator[174682]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:59:16 compute-0 sudo[174653]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:16 compute-0 sudo[174764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpceshtvpismgrjtvlewydqglzrdolrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391954.5043514-1656-189483066189343/AnsiballZ_systemd.py'
Oct 02 07:59:16 compute-0 sudo[174764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:16 compute-0 python3.9[174766]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:59:16 compute-0 systemd[1]: Reloading.
Oct 02 07:59:17 compute-0 systemd-sysv-generator[174799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:59:17 compute-0 systemd-rc-local-generator[174795]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:59:17 compute-0 systemd[1]: Starting multipathd container...
Oct 02 07:59:17 compute-0 systemd[1]: Started libcrun container.
Oct 02 07:59:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54fa8f38aa659526b1e3047731ab02e4cb69fb725a9c4c9b099a51f6c2f687f1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 02 07:59:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54fa8f38aa659526b1e3047731ab02e4cb69fb725a9c4c9b099a51f6c2f687f1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 02 07:59:17 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b.
Oct 02 07:59:17 compute-0 podman[174806]: 2025-10-02 07:59:17.384825498 +0000 UTC m=+0.119040296 container init bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true)
Oct 02 07:59:17 compute-0 multipathd[174822]: + sudo -E kolla_set_configs
Oct 02 07:59:17 compute-0 podman[174806]: 2025-10-02 07:59:17.419319753 +0000 UTC m=+0.153534521 container start bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 02 07:59:17 compute-0 sudo[174828]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 02 07:59:17 compute-0 sudo[174828]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 02 07:59:17 compute-0 podman[174806]: multipathd
Oct 02 07:59:17 compute-0 sudo[174828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 02 07:59:17 compute-0 systemd[1]: Started multipathd container.
Oct 02 07:59:17 compute-0 multipathd[174822]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 02 07:59:17 compute-0 multipathd[174822]: INFO:__main__:Validating config file
Oct 02 07:59:17 compute-0 multipathd[174822]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 02 07:59:17 compute-0 multipathd[174822]: INFO:__main__:Writing out command to execute
Oct 02 07:59:17 compute-0 sudo[174828]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:17 compute-0 sudo[174764]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:17 compute-0 multipathd[174822]: ++ cat /run_command
Oct 02 07:59:17 compute-0 multipathd[174822]: + CMD='/usr/sbin/multipathd -d'
Oct 02 07:59:17 compute-0 multipathd[174822]: + ARGS=
Oct 02 07:59:17 compute-0 multipathd[174822]: + sudo kolla_copy_cacerts
Oct 02 07:59:17 compute-0 sudo[174842]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 02 07:59:17 compute-0 sudo[174842]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 02 07:59:17 compute-0 sudo[174842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 02 07:59:17 compute-0 sudo[174842]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:17 compute-0 multipathd[174822]: + [[ ! -n '' ]]
Oct 02 07:59:17 compute-0 multipathd[174822]: + . kolla_extend_start
Oct 02 07:59:17 compute-0 multipathd[174822]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 02 07:59:17 compute-0 multipathd[174822]: Running command: '/usr/sbin/multipathd -d'
Oct 02 07:59:17 compute-0 multipathd[174822]: + umask 0022
Oct 02 07:59:17 compute-0 multipathd[174822]: + exec /usr/sbin/multipathd -d
Oct 02 07:59:17 compute-0 multipathd[174822]: 2845.163936 | --------start up--------
Oct 02 07:59:17 compute-0 multipathd[174822]: 2845.163953 | read /etc/multipath.conf
Oct 02 07:59:17 compute-0 podman[174829]: 2025-10-02 07:59:17.517944626 +0000 UTC m=+0.082349592 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd)
Oct 02 07:59:17 compute-0 multipathd[174822]: 2845.170564 | path checkers start up
Oct 02 07:59:17 compute-0 systemd[1]: bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b-31c575fd213e2f35.service: Main process exited, code=exited, status=1/FAILURE
Oct 02 07:59:17 compute-0 systemd[1]: bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b-31c575fd213e2f35.service: Failed with result 'exit-code'.
Oct 02 07:59:18 compute-0 python3.9[175010]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 07:59:18 compute-0 sudo[175162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zejopwxfykwcfjuavxnphpoqrnrjjrxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391958.5676796-1728-217162364093190/AnsiballZ_command.py'
Oct 02 07:59:18 compute-0 sudo[175162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:19 compute-0 python3.9[175164]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 07:59:19 compute-0 sudo[175162]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:19 compute-0 sudo[175356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtwzcowivituhbuxbrogcollohmgbkfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391959.4565923-1744-218834235640353/AnsiballZ_systemd.py'
Oct 02 07:59:19 compute-0 sudo[175356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:19 compute-0 podman[175302]: 2025-10-02 07:59:19.852942473 +0000 UTC m=+0.096980172 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 02 07:59:19 compute-0 podman[175301]: 2025-10-02 07:59:19.854699648 +0000 UTC m=+0.093329206 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 07:59:20 compute-0 python3.9[175366]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:59:20 compute-0 systemd[1]: Stopping multipathd container...
Oct 02 07:59:20 compute-0 multipathd[174822]: 2847.881138 | exit (signal)
Oct 02 07:59:20 compute-0 multipathd[174822]: 2847.881204 | --------shut down-------
Oct 02 07:59:20 compute-0 systemd[1]: libpod-bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b.scope: Deactivated successfully.
Oct 02 07:59:20 compute-0 podman[175377]: 2025-10-02 07:59:20.265250374 +0000 UTC m=+0.066351218 container died bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 02 07:59:20 compute-0 systemd[1]: bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b-31c575fd213e2f35.timer: Deactivated successfully.
Oct 02 07:59:20 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b.
Oct 02 07:59:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b-userdata-shm.mount: Deactivated successfully.
Oct 02 07:59:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-54fa8f38aa659526b1e3047731ab02e4cb69fb725a9c4c9b099a51f6c2f687f1-merged.mount: Deactivated successfully.
Oct 02 07:59:20 compute-0 podman[175377]: 2025-10-02 07:59:20.312331275 +0000 UTC m=+0.113432159 container cleanup bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Oct 02 07:59:20 compute-0 podman[175377]: multipathd
Oct 02 07:59:20 compute-0 podman[175403]: multipathd
Oct 02 07:59:20 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct 02 07:59:20 compute-0 systemd[1]: Stopped multipathd container.
Oct 02 07:59:20 compute-0 systemd[1]: Starting multipathd container...
Oct 02 07:59:20 compute-0 systemd[1]: Started libcrun container.
Oct 02 07:59:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54fa8f38aa659526b1e3047731ab02e4cb69fb725a9c4c9b099a51f6c2f687f1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 02 07:59:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54fa8f38aa659526b1e3047731ab02e4cb69fb725a9c4c9b099a51f6c2f687f1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 02 07:59:20 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b.
Oct 02 07:59:20 compute-0 podman[175416]: 2025-10-02 07:59:20.608706389 +0000 UTC m=+0.164702672 container init bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 02 07:59:20 compute-0 multipathd[175431]: + sudo -E kolla_set_configs
Oct 02 07:59:20 compute-0 podman[175416]: 2025-10-02 07:59:20.639656323 +0000 UTC m=+0.195652576 container start bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Oct 02 07:59:20 compute-0 podman[175416]: multipathd
Oct 02 07:59:20 compute-0 sudo[175437]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 02 07:59:20 compute-0 systemd[1]: Started multipathd container.
Oct 02 07:59:20 compute-0 sudo[175437]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 02 07:59:20 compute-0 sudo[175437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 02 07:59:20 compute-0 sudo[175356]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:20 compute-0 multipathd[175431]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 02 07:59:20 compute-0 multipathd[175431]: INFO:__main__:Validating config file
Oct 02 07:59:20 compute-0 multipathd[175431]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 02 07:59:20 compute-0 multipathd[175431]: INFO:__main__:Writing out command to execute
Oct 02 07:59:20 compute-0 sudo[175437]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:20 compute-0 multipathd[175431]: ++ cat /run_command
Oct 02 07:59:20 compute-0 multipathd[175431]: + CMD='/usr/sbin/multipathd -d'
Oct 02 07:59:20 compute-0 multipathd[175431]: + ARGS=
Oct 02 07:59:20 compute-0 multipathd[175431]: + sudo kolla_copy_cacerts
Oct 02 07:59:20 compute-0 sudo[175461]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 02 07:59:20 compute-0 sudo[175461]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 02 07:59:20 compute-0 sudo[175461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 02 07:59:20 compute-0 sudo[175461]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:20 compute-0 multipathd[175431]: + [[ ! -n '' ]]
Oct 02 07:59:20 compute-0 multipathd[175431]: + . kolla_extend_start
Oct 02 07:59:20 compute-0 multipathd[175431]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 02 07:59:20 compute-0 multipathd[175431]: Running command: '/usr/sbin/multipathd -d'
Oct 02 07:59:20 compute-0 multipathd[175431]: + umask 0022
Oct 02 07:59:20 compute-0 multipathd[175431]: + exec /usr/sbin/multipathd -d
Oct 02 07:59:20 compute-0 podman[175438]: 2025-10-02 07:59:20.754575998 +0000 UTC m=+0.095489135 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 02 07:59:20 compute-0 systemd[1]: bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b-202d086a2a3ff711.service: Main process exited, code=exited, status=1/FAILURE
Oct 02 07:59:20 compute-0 systemd[1]: bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b-202d086a2a3ff711.service: Failed with result 'exit-code'.
Oct 02 07:59:20 compute-0 multipathd[175431]: 2848.429394 | --------start up--------
Oct 02 07:59:20 compute-0 multipathd[175431]: 2848.429549 | read /etc/multipath.conf
Oct 02 07:59:20 compute-0 multipathd[175431]: 2848.436969 | path checkers start up
Oct 02 07:59:22 compute-0 sudo[175618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ushcuiiukuchizvxkphxoanmzmhottfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391962.0482068-1760-216950958046684/AnsiballZ_file.py'
Oct 02 07:59:22 compute-0 sudo[175618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:22 compute-0 python3.9[175620]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:22 compute-0 sudo[175618]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:23 compute-0 sudo[175782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrwvynzvbcbmlnivomzikwmfbiiciwxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391963.2563474-1784-118978648679715/AnsiballZ_file.py'
Oct 02 07:59:23 compute-0 sudo[175782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:23 compute-0 podman[175744]: 2025-10-02 07:59:23.675118427 +0000 UTC m=+0.089352332 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 07:59:23 compute-0 python3.9[175789]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 02 07:59:23 compute-0 sudo[175782]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:24 compute-0 sudo[175942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbbjxwtnpmkxfozpuhcjfgtpmaogghge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391964.1364968-1800-171445179553596/AnsiballZ_modprobe.py'
Oct 02 07:59:24 compute-0 sudo[175942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:24 compute-0 python3.9[175944]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct 02 07:59:24 compute-0 kernel: Key type psk registered
Oct 02 07:59:24 compute-0 sudo[175942]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:25 compute-0 sudo[176105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bonuemsmghioljxyoqnaenosqynkntyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391964.9519258-1816-261184184618642/AnsiballZ_stat.py'
Oct 02 07:59:25 compute-0 sudo[176105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:25 compute-0 python3.9[176107]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 07:59:25 compute-0 sudo[176105]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:25 compute-0 sudo[176228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bubsijcthdxtccirmtcliyvzgxjsjvzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391964.9519258-1816-261184184618642/AnsiballZ_copy.py'
Oct 02 07:59:25 compute-0 sudo[176228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:26 compute-0 python3.9[176230]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759391964.9519258-1816-261184184618642/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:26 compute-0 sudo[176228]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:26 compute-0 sudo[176380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byzbelezgaqjnnjqxfkxytwbfnclrpih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391966.427806-1848-117103558154429/AnsiballZ_lineinfile.py'
Oct 02 07:59:26 compute-0 sudo[176380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:27 compute-0 python3.9[176382]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:27 compute-0 sudo[176380]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:27 compute-0 sudo[176532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqlqmeoxxejnteuhoncfjsjwqmnqbswt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391967.3119726-1864-1544522364585/AnsiballZ_systemd.py'
Oct 02 07:59:27 compute-0 sudo[176532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:28 compute-0 python3.9[176534]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 07:59:28 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 02 07:59:28 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct 02 07:59:28 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct 02 07:59:28 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 02 07:59:28 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 02 07:59:28 compute-0 sudo[176532]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:28 compute-0 sudo[176688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwdweuhiggijfpxxmuwrqncqumhkrdoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391968.4766545-1880-172734870747131/AnsiballZ_setup.py'
Oct 02 07:59:28 compute-0 sudo[176688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:29 compute-0 python3.9[176690]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 02 07:59:29 compute-0 sudo[176688]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:30 compute-0 sudo[176772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqikovjkpbufflkgfgcauzgzruxgsruv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391968.4766545-1880-172734870747131/AnsiballZ_dnf.py'
Oct 02 07:59:30 compute-0 sudo[176772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:30 compute-0 python3.9[176774]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 02 07:59:36 compute-0 systemd[1]: Reloading.
Oct 02 07:59:36 compute-0 systemd-rc-local-generator[176807]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:59:36 compute-0 systemd-sysv-generator[176811]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:59:36 compute-0 systemd[1]: Reloading.
Oct 02 07:59:36 compute-0 systemd-rc-local-generator[176840]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:59:36 compute-0 systemd-sysv-generator[176843]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:59:37 compute-0 systemd-logind[827]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 02 07:59:37 compute-0 systemd-logind[827]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 02 07:59:37 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 02 07:59:37 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 02 07:59:37 compute-0 systemd[1]: Reloading.
Oct 02 07:59:37 compute-0 systemd-rc-local-generator[176937]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:59:37 compute-0 systemd-sysv-generator[176940]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:59:37 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 02 07:59:38 compute-0 sudo[176772]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:38 compute-0 sudo[178223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suyaonznuloirzryeoqfpmdyjocudyfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391978.4777186-1904-208571566847003/AnsiballZ_file.py'
Oct 02 07:59:38 compute-0 sudo[178223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:38 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 02 07:59:38 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 02 07:59:38 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.755s CPU time.
Oct 02 07:59:38 compute-0 systemd[1]: run-r60eb86f745154ac0a6f02278e7ffdf26.service: Deactivated successfully.
Oct 02 07:59:39 compute-0 python3.9[178225]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:39 compute-0 sudo[178223]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:39 compute-0 python3.9[178376]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 07:59:40 compute-0 sudo[178530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lysspcjdwsnkfpbudeugalunmnjddegl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391980.5845332-1939-112079802587310/AnsiballZ_file.py'
Oct 02 07:59:40 compute-0 sudo[178530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:41 compute-0 python3.9[178532]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:41 compute-0 sudo[178530]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:42 compute-0 sudo[178682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugqfgtdsbuvqpisdsjqasxuutnzqnbox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391981.6498754-1961-50471067626806/AnsiballZ_systemd_service.py'
Oct 02 07:59:42 compute-0 sudo[178682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:42 compute-0 python3.9[178684]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 07:59:42 compute-0 systemd[1]: Reloading.
Oct 02 07:59:42 compute-0 systemd-sysv-generator[178715]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 07:59:42 compute-0 systemd-rc-local-generator[178712]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 07:59:43 compute-0 sudo[178682]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:43 compute-0 python3.9[178870]: ansible-ansible.builtin.service_facts Invoked
Oct 02 07:59:43 compute-0 network[178887]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 02 07:59:43 compute-0 network[178888]: 'network-scripts' will be removed from distribution in near future.
Oct 02 07:59:43 compute-0 network[178889]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 02 07:59:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:59:45.957 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 07:59:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:59:45.959 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 07:59:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 07:59:45.959 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 07:59:48 compute-0 sudo[179164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yughgcwprdkuwzgbxnynuqpyxykucvco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391988.53215-1999-137486969972257/AnsiballZ_systemd_service.py'
Oct 02 07:59:48 compute-0 sudo[179164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:49 compute-0 python3.9[179166]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:59:49 compute-0 sudo[179164]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:49 compute-0 sudo[179317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zszjusagylniteqztizvkkegnyvbobls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391989.521308-1999-217415683080359/AnsiballZ_systemd_service.py'
Oct 02 07:59:49 compute-0 sudo[179317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:49 compute-0 podman[179319]: 2025-10-02 07:59:49.988101546 +0000 UTC m=+0.088211006 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct 02 07:59:50 compute-0 podman[179320]: 2025-10-02 07:59:50.002903781 +0000 UTC m=+0.103257489 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 07:59:50 compute-0 python3.9[179321]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:59:50 compute-0 sudo[179317]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:50 compute-0 sudo[179513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owcbajxwntyctmulffmilrkpxlvblzwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391990.4387596-1999-187285227919638/AnsiballZ_systemd_service.py'
Oct 02 07:59:50 compute-0 sudo[179513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:50 compute-0 podman[179515]: 2025-10-02 07:59:50.955447528 +0000 UTC m=+0.105827301 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct 02 07:59:51 compute-0 python3.9[179516]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:59:51 compute-0 sudo[179513]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:51 compute-0 sudo[179687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrcdxjyastcljuhzomdwfpmlhayslbrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391991.36331-1999-10066300706881/AnsiballZ_systemd_service.py'
Oct 02 07:59:51 compute-0 sudo[179687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:52 compute-0 python3.9[179689]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:59:53 compute-0 sudo[179687]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:53 compute-0 sudo[179840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhyglganubgldhehvjdlsyquwfryynfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391993.3173714-1999-50243633029003/AnsiballZ_systemd_service.py'
Oct 02 07:59:53 compute-0 sudo[179840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:53 compute-0 podman[179842]: 2025-10-02 07:59:53.846656192 +0000 UTC m=+0.070974304 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 07:59:54 compute-0 python3.9[179843]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:59:54 compute-0 sudo[179840]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:54 compute-0 sudo[180014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzytztqruxtrcshzykypicdokxkkrlxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391994.3077743-1999-49413198401650/AnsiballZ_systemd_service.py'
Oct 02 07:59:54 compute-0 sudo[180014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:54 compute-0 python3.9[180016]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:59:55 compute-0 sudo[180014]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:55 compute-0 sudo[180167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apezuipdrzvuxffrinhbdugeufcelqvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391995.1777124-1999-115234294828233/AnsiballZ_systemd_service.py'
Oct 02 07:59:55 compute-0 sudo[180167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:55 compute-0 python3.9[180169]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:59:55 compute-0 sudo[180167]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:56 compute-0 sudo[180320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnucgvogdbaumcnkniocagdqawyiqntx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391996.0883212-1999-277472000439054/AnsiballZ_systemd_service.py'
Oct 02 07:59:56 compute-0 sudo[180320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:56 compute-0 python3.9[180322]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 07:59:56 compute-0 sudo[180320]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:57 compute-0 sudo[180473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piptysqcugpetjuxhlobisspoccyucvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391997.5142534-2117-61886257858059/AnsiballZ_file.py'
Oct 02 07:59:57 compute-0 sudo[180473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:58 compute-0 python3.9[180475]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:58 compute-0 sudo[180473]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:58 compute-0 sudo[180625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzutarfpmcsessschqbeuwndnoqwltgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391998.2858946-2117-92842676002207/AnsiballZ_file.py'
Oct 02 07:59:58 compute-0 sudo[180625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:58 compute-0 python3.9[180627]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:58 compute-0 sudo[180625]: pam_unix(sudo:session): session closed for user root
Oct 02 07:59:59 compute-0 sudo[180777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqlxrpfbchyopebdbkqcfebzmpfefues ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391999.0230951-2117-182376927011255/AnsiballZ_file.py'
Oct 02 07:59:59 compute-0 sudo[180777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 07:59:59 compute-0 python3.9[180779]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 07:59:59 compute-0 sudo[180777]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:00 compute-0 sudo[180929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxivbzurbefmgqgitzyezcahuzsgccxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759391999.8105319-2117-268856313356693/AnsiballZ_file.py'
Oct 02 08:00:00 compute-0 sudo[180929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:00 compute-0 python3.9[180931]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:00 compute-0 sudo[180929]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:00 compute-0 sudo[181081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wszbupvdslqurctjdlldrfelxytxgdzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392000.640588-2117-255209935033487/AnsiballZ_file.py'
Oct 02 08:00:00 compute-0 sudo[181081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:01 compute-0 python3.9[181083]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:01 compute-0 sudo[181081]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:01 compute-0 sudo[181233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpfztmmfhbuiukeffvuavmalccgzmeah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392001.3791885-2117-227930198443107/AnsiballZ_file.py'
Oct 02 08:00:01 compute-0 sudo[181233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:01 compute-0 python3.9[181235]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:01 compute-0 sudo[181233]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:02 compute-0 sudo[181385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nswoxutcllkaehtgtxxlcuskubwwehfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392002.0915387-2117-105213334934933/AnsiballZ_file.py'
Oct 02 08:00:02 compute-0 sudo[181385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:02 compute-0 python3.9[181387]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:02 compute-0 sudo[181385]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:03 compute-0 sudo[181537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oujonvafghpgpncahesjbhpsovtcvyvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392002.8422964-2117-195843171557897/AnsiballZ_file.py'
Oct 02 08:00:03 compute-0 sudo[181537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:03 compute-0 python3.9[181539]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:03 compute-0 sudo[181537]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:04 compute-0 sudo[181689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epwnejyahwtefomeunwjneejhsyoggmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392003.766189-2231-273241605892740/AnsiballZ_file.py'
Oct 02 08:00:04 compute-0 sudo[181689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:04 compute-0 python3.9[181691]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:04 compute-0 sudo[181689]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:04 compute-0 sudo[181841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plqmvzlbnumllxaagptnwractgtewfem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392004.4289563-2231-165909067197859/AnsiballZ_file.py'
Oct 02 08:00:04 compute-0 sudo[181841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:04 compute-0 python3.9[181843]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:04 compute-0 sudo[181841]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:05 compute-0 sudo[181993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkfqmiphnopyjunrgnwrvjzriuotfgen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392005.1135592-2231-257137994565626/AnsiballZ_file.py'
Oct 02 08:00:05 compute-0 sudo[181993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:05 compute-0 python3.9[181995]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:05 compute-0 sudo[181993]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:06 compute-0 sudo[182145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dczesujijzwofpybjhmvdapzfkczjecm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392005.8751202-2231-34748413439019/AnsiballZ_file.py'
Oct 02 08:00:06 compute-0 sudo[182145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:06 compute-0 python3.9[182147]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:06 compute-0 sudo[182145]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:06 compute-0 sudo[182297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txpkihlnpbhjaqtjlkyegazvxrguojbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392006.5985444-2231-53115799345397/AnsiballZ_file.py'
Oct 02 08:00:07 compute-0 sudo[182297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:07 compute-0 python3.9[182299]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:07 compute-0 sudo[182297]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:07 compute-0 sudo[182449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyjbwrsgrvfdwgqlrkmjggiskkqywlzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392007.3974786-2231-240617547749488/AnsiballZ_file.py'
Oct 02 08:00:07 compute-0 sudo[182449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:08 compute-0 python3.9[182451]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:08 compute-0 sudo[182449]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:08 compute-0 sudo[182601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmamplzpimtdbvvdbwvnbmnjbgcfpina ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392008.2841427-2231-230298385257545/AnsiballZ_file.py'
Oct 02 08:00:08 compute-0 sudo[182601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:08 compute-0 python3.9[182603]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:08 compute-0 sudo[182601]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:09 compute-0 sudo[182753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvgpxoffgtspteumolyypbltnsglrmzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392009.0325105-2231-128200508747974/AnsiballZ_file.py'
Oct 02 08:00:09 compute-0 sudo[182753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:09 compute-0 python3.9[182755]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:09 compute-0 sudo[182753]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:10 compute-0 sudo[182905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okvjyvipmknxtjuoujsndhxhrhaiwdjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392010.0911121-2347-158610834445836/AnsiballZ_command.py'
Oct 02 08:00:10 compute-0 sudo[182905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:10 compute-0 python3.9[182907]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:00:10 compute-0 sudo[182905]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:11 compute-0 python3.9[183059]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 02 08:00:12 compute-0 sudo[183209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ockwmftdbaafzoikpkwiyokiswfcnqoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392012.028726-2383-170011824339705/AnsiballZ_systemd_service.py'
Oct 02 08:00:12 compute-0 sudo[183209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:12 compute-0 python3.9[183211]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 08:00:12 compute-0 systemd[1]: Reloading.
Oct 02 08:00:12 compute-0 systemd-rc-local-generator[183238]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 08:00:12 compute-0 systemd-sysv-generator[183241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 08:00:13 compute-0 sudo[183209]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:13 compute-0 sudo[183396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tijzzbapsnfwweskldznfalynslmfhdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392013.303232-2399-250727390986976/AnsiballZ_command.py'
Oct 02 08:00:13 compute-0 sudo[183396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:13 compute-0 python3.9[183398]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:00:13 compute-0 sudo[183396]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:14 compute-0 sudo[183549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vessapwefotuqlqtbwbkehccbjzluaop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392014.2161703-2399-144677848710534/AnsiballZ_command.py'
Oct 02 08:00:14 compute-0 sudo[183549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:14 compute-0 python3.9[183551]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:00:14 compute-0 sudo[183549]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:15 compute-0 sudo[183702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liohsngdblpnyrnacomkbiqrhhxoymgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392015.022176-2399-128725423643096/AnsiballZ_command.py'
Oct 02 08:00:15 compute-0 sudo[183702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:15 compute-0 python3.9[183704]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:00:15 compute-0 sudo[183702]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:16 compute-0 sudo[183855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lptrxmtjqczfojvymlvhhkbhkhnoorjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392015.809012-2399-41173792201291/AnsiballZ_command.py'
Oct 02 08:00:16 compute-0 sudo[183855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:16 compute-0 python3.9[183857]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:00:16 compute-0 sudo[183855]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:16 compute-0 sudo[184008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdhwhgdipqivhlxdraduanhsxodlasdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392016.5696986-2399-195039102187873/AnsiballZ_command.py'
Oct 02 08:00:16 compute-0 sudo[184008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:17 compute-0 python3.9[184010]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:00:17 compute-0 sudo[184008]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:17 compute-0 sudo[184161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okzclgoideldvpfesgfmgravdrfnqjjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392017.2891047-2399-268484949290821/AnsiballZ_command.py'
Oct 02 08:00:17 compute-0 sudo[184161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:17 compute-0 python3.9[184163]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:00:17 compute-0 sudo[184161]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:18 compute-0 sudo[184314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqwdwecppbavprwvmzvgsmcziebzkgpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392017.925162-2399-107961273634774/AnsiballZ_command.py'
Oct 02 08:00:18 compute-0 sudo[184314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:18 compute-0 python3.9[184316]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:00:18 compute-0 sudo[184314]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:18 compute-0 sudo[184467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfephaeghjmrohrdhfhdybegecphlgwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392018.5707889-2399-224945988306182/AnsiballZ_command.py'
Oct 02 08:00:18 compute-0 sudo[184467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:19 compute-0 python3.9[184469]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:00:19 compute-0 sudo[184467]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:20 compute-0 podman[184495]: 2025-10-02 08:00:20.159351817 +0000 UTC m=+0.073064569 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:00:20 compute-0 podman[184496]: 2025-10-02 08:00:20.251826832 +0000 UTC m=+0.156619933 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Oct 02 08:00:20 compute-0 sudo[184665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixhyokjgazmxgcycemuvdjsukpsqzabp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392020.2821283-2542-111864516910748/AnsiballZ_file.py'
Oct 02 08:00:20 compute-0 sudo[184665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:20 compute-0 python3.9[184667]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:20 compute-0 sudo[184665]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:21 compute-0 podman[184713]: 2025-10-02 08:00:21.196110118 +0000 UTC m=+0.092568669 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:00:21 compute-0 sudo[184838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvbbbhskozhiowyguioulrfwglhnmeqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392021.077149-2542-7132546178199/AnsiballZ_file.py'
Oct 02 08:00:21 compute-0 sudo[184838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:21 compute-0 python3.9[184840]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:21 compute-0 sudo[184838]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:22 compute-0 sudo[184990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agubqwylptdhzvrwltxvgpecneyzcyem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392021.8329444-2542-166010076044144/AnsiballZ_file.py'
Oct 02 08:00:22 compute-0 sudo[184990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:22 compute-0 python3.9[184992]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:22 compute-0 sudo[184990]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:23 compute-0 sudo[185142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jarnfkdrzwujouehvefyabixvefuxdek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392022.7944717-2586-59067526497424/AnsiballZ_file.py'
Oct 02 08:00:23 compute-0 sudo[185142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:23 compute-0 python3.9[185144]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:23 compute-0 sudo[185142]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:24 compute-0 sudo[185308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsyixzwrpjmszerysdhtqpfdqirxlmlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392023.601581-2586-14107061663266/AnsiballZ_file.py'
Oct 02 08:00:24 compute-0 sudo[185308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:24 compute-0 podman[185268]: 2025-10-02 08:00:24.024594705 +0000 UTC m=+0.091274109 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 02 08:00:24 compute-0 python3.9[185313]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:24 compute-0 sudo[185308]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:24 compute-0 sudo[185463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uytwkcljecwvdipinzipyxkqorimjamg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392024.4836102-2586-222078228548194/AnsiballZ_file.py'
Oct 02 08:00:24 compute-0 sudo[185463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:25 compute-0 python3.9[185465]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:25 compute-0 sudo[185463]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:25 compute-0 sudo[185615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqzzzgxgrdlwuydobszuwqwsmtmjxgpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392025.2096713-2586-17500277111047/AnsiballZ_file.py'
Oct 02 08:00:25 compute-0 sudo[185615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:25 compute-0 python3.9[185617]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:25 compute-0 sudo[185615]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:26 compute-0 sudo[185767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enqyxmfyfjnxisqmtvseenpvgnqqgscm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392025.9823933-2586-13260766160919/AnsiballZ_file.py'
Oct 02 08:00:26 compute-0 sudo[185767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:26 compute-0 python3.9[185769]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:26 compute-0 sudo[185767]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:27 compute-0 sudo[185919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpnslzsyxulugfkxesblxhpfvtkxwinx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392026.8586404-2586-123831489829223/AnsiballZ_file.py'
Oct 02 08:00:27 compute-0 sudo[185919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:27 compute-0 python3.9[185921]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:27 compute-0 sudo[185919]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:28 compute-0 sudo[186071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwruhupgwyoghwomyqiczigdbbzdkkpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392027.6531217-2586-259462350222198/AnsiballZ_file.py'
Oct 02 08:00:28 compute-0 sudo[186071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:28 compute-0 python3.9[186073]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:28 compute-0 sudo[186071]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:28 compute-0 sudo[186223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghxtpnujtnrajzbzalrsvlkbppxygwxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392028.4043732-2586-199700764683531/AnsiballZ_file.py'
Oct 02 08:00:28 compute-0 sudo[186223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:28 compute-0 python3.9[186225]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:28 compute-0 sudo[186223]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:29 compute-0 sudo[186375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijxrzeixpwenxhdrlpxciejncdpgptlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392029.1154866-2586-134318459075179/AnsiballZ_file.py'
Oct 02 08:00:29 compute-0 sudo[186375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:29 compute-0 python3.9[186377]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:29 compute-0 sudo[186375]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:34 compute-0 sudo[186527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wujxmxntchxkejibzqmsbpiuqkupkmoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392034.0552497-2851-128519834829128/AnsiballZ_getent.py'
Oct 02 08:00:34 compute-0 sudo[186527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:34 compute-0 python3.9[186529]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct 02 08:00:34 compute-0 sudo[186527]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:35 compute-0 sudo[186680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waastdeevsrjnpcwhagesbbomlgotvlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392035.1037078-2867-201257844451739/AnsiballZ_group.py'
Oct 02 08:00:35 compute-0 sudo[186680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:35 compute-0 python3.9[186682]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 02 08:00:35 compute-0 groupadd[186683]: group added to /etc/group: name=nova, GID=42436
Oct 02 08:00:35 compute-0 groupadd[186683]: group added to /etc/gshadow: name=nova
Oct 02 08:00:35 compute-0 groupadd[186683]: new group: name=nova, GID=42436
Oct 02 08:00:35 compute-0 sudo[186680]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:36 compute-0 sudo[186838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anmrmpxuxblwfhfnhxxnthjjtxhrbhll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392036.2539966-2883-172275744519401/AnsiballZ_user.py'
Oct 02 08:00:36 compute-0 sudo[186838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:37 compute-0 python3.9[186840]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 02 08:00:37 compute-0 useradd[186842]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Oct 02 08:00:37 compute-0 useradd[186842]: add 'nova' to group 'libvirt'
Oct 02 08:00:37 compute-0 useradd[186842]: add 'nova' to shadow group 'libvirt'
Oct 02 08:00:37 compute-0 sudo[186838]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:37 compute-0 sshd-session[186873]: Accepted publickey for zuul from 192.168.122.30 port 41808 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 08:00:38 compute-0 systemd-logind[827]: New session 27 of user zuul.
Oct 02 08:00:38 compute-0 systemd[1]: Started Session 27 of User zuul.
Oct 02 08:00:38 compute-0 sshd-session[186873]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 08:00:38 compute-0 sshd-session[186876]: Received disconnect from 192.168.122.30 port 41808:11: disconnected by user
Oct 02 08:00:38 compute-0 sshd-session[186876]: Disconnected from user zuul 192.168.122.30 port 41808
Oct 02 08:00:38 compute-0 sshd-session[186873]: pam_unix(sshd:session): session closed for user zuul
Oct 02 08:00:38 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Oct 02 08:00:38 compute-0 systemd-logind[827]: Session 27 logged out. Waiting for processes to exit.
Oct 02 08:00:38 compute-0 systemd-logind[827]: Removed session 27.
Oct 02 08:00:38 compute-0 python3.9[187026]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:00:39 compute-0 python3.9[187147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759392038.4483392-2933-76054232123303/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:40 compute-0 python3.9[187297]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:00:41 compute-0 python3.9[187373]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:41 compute-0 python3.9[187523]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:00:42 compute-0 python3.9[187644]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759392041.3056185-2933-8238658211184/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:43 compute-0 python3.9[187794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:00:43 compute-0 python3.9[187915]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759392042.7012763-2933-19114061156454/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:44 compute-0 python3.9[188065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:00:45 compute-0 python3.9[188186]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759392044.1273942-2933-7861359430629/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:00:45.959 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:00:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:00:45.960 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:00:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:00:45.960 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:00:45 compute-0 sudo[188336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyaefxlomyreikdhvqkmjkspmazielxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392045.6341748-3071-146803139700923/AnsiballZ_file.py'
Oct 02 08:00:46 compute-0 sudo[188336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:46 compute-0 python3.9[188338]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:46 compute-0 sudo[188336]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:46 compute-0 sudo[188488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukeizjyhbfoszgragksqqmfsxblitgqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392046.4575307-3087-252943878177476/AnsiballZ_copy.py'
Oct 02 08:00:46 compute-0 sudo[188488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:47 compute-0 python3.9[188490]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:00:47 compute-0 sudo[188488]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:47 compute-0 sudo[188640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uduexcoxcozlsznnsyxwdudxwpivzjjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392047.2908247-3103-191114262876924/AnsiballZ_stat.py'
Oct 02 08:00:47 compute-0 sudo[188640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:47 compute-0 python3.9[188642]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:00:47 compute-0 sudo[188640]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:48 compute-0 sudo[188792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vroatghldequsbncypxldxukisynsoiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392048.1193504-3119-200792855916824/AnsiballZ_stat.py'
Oct 02 08:00:48 compute-0 sudo[188792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:48 compute-0 python3.9[188794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:00:48 compute-0 sudo[188792]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:49 compute-0 sudo[188915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owahztimbbwbyxckaltxlrfbhqyjbrdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392048.1193504-3119-200792855916824/AnsiballZ_copy.py'
Oct 02 08:00:49 compute-0 sudo[188915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:49 compute-0 python3.9[188917]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759392048.1193504-3119-200792855916824/.source _original_basename=.44hfpkf4 follow=False checksum=d60e3ac4c48105062fcf36b2b2e96f249e477e19 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct 02 08:00:49 compute-0 sudo[188915]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:50 compute-0 python3.9[189069]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:00:51 compute-0 podman[189195]: 2025-10-02 08:00:51.120056855 +0000 UTC m=+0.073706647 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct 02 08:00:51 compute-0 podman[189196]: 2025-10-02 08:00:51.192023238 +0000 UTC m=+0.137360597 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:00:51 compute-0 python3.9[189245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:00:51 compute-0 podman[189360]: 2025-10-02 08:00:51.773161729 +0000 UTC m=+0.092289041 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:00:51 compute-0 python3.9[189397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759392050.7009737-3171-253060540044138/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=f022386746472553146d29f689b545df70fa8a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:52 compute-0 python3.9[189555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:00:53 compute-0 python3.9[189676]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759392052.179811-3201-48040086930749/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:00:54 compute-0 podman[189753]: 2025-10-02 08:00:54.167005809 +0000 UTC m=+0.078399550 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:00:54 compute-0 sudo[189846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbztepjxyfnlbxmzbaalwvyzwpqwjzvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392053.9808059-3235-155681436901846/AnsiballZ_container_config_data.py'
Oct 02 08:00:54 compute-0 sudo[189846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:54 compute-0 python3.9[189848]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct 02 08:00:54 compute-0 sudo[189846]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:55 compute-0 sudo[189998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnpbryyekkhaogizfenftsdtjcserycx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392054.8337233-3253-107218368830086/AnsiballZ_container_config_hash.py'
Oct 02 08:00:55 compute-0 sudo[189998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:55 compute-0 python3.9[190000]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 02 08:00:55 compute-0 sudo[189998]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:56 compute-0 sudo[190150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfnbzyquvpbduekxpckyhoeugdmfpdgz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759392055.825901-3273-17132364992145/AnsiballZ_edpm_container_manage.py'
Oct 02 08:00:56 compute-0 sudo[190150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:56 compute-0 python3[190152]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct 02 08:00:56 compute-0 podman[190189]: 2025-10-02 08:00:56.720057132 +0000 UTC m=+0.059409114 container create 05e1d1c1ed5c4659c691b7799e217a71735860f200a1892da2fcfaa87f3417ce (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251001, container_name=nova_compute_init, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible)
Oct 02 08:00:56 compute-0 podman[190189]: 2025-10-02 08:00:56.687971258 +0000 UTC m=+0.027323280 image pull cb7a9bebda1404fc92f1415580e7da04b5fcfd160582e38b9b99703a41ed1519 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct 02 08:00:56 compute-0 python3[190152]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct 02 08:00:56 compute-0 sudo[190150]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:57 compute-0 sudo[190377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyuhjskqdkofwuzyuqzdlmdnmujbjrpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392057.240606-3289-179064542588174/AnsiballZ_stat.py'
Oct 02 08:00:57 compute-0 sudo[190377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:57 compute-0 python3.9[190379]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:00:57 compute-0 sudo[190377]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:58 compute-0 sudo[190531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbimwxdqyjdcluqelkbgzkwvuuzqfvyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392058.4888325-3313-59574901949709/AnsiballZ_container_config_data.py'
Oct 02 08:00:58 compute-0 sudo[190531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:00:59 compute-0 python3.9[190533]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct 02 08:00:59 compute-0 sudo[190531]: pam_unix(sudo:session): session closed for user root
Oct 02 08:00:59 compute-0 sudo[190683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqbbrslzfmngxgqvvnyjccenblahymxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392059.4751248-3331-229521576908143/AnsiballZ_container_config_hash.py'
Oct 02 08:00:59 compute-0 sudo[190683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:00 compute-0 python3.9[190685]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 02 08:01:00 compute-0 sudo[190683]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:00 compute-0 sudo[190835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xudtgzuuwsifxwdrgjstqebmflneqrnw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759392060.4266052-3351-251032244509521/AnsiballZ_edpm_container_manage.py'
Oct 02 08:01:00 compute-0 sudo[190835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:01 compute-0 python3[190837]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 02 08:01:01 compute-0 podman[190875]: 2025-10-02 08:01:01.27600105 +0000 UTC m=+0.068906699 container create efbb211a2f0d2c50bafd3b3e9ede52cf80edfd73757aa793dad9d7e9e4ad46a7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:01:01 compute-0 podman[190875]: 2025-10-02 08:01:01.239203215 +0000 UTC m=+0.032108934 image pull cb7a9bebda1404fc92f1415580e7da04b5fcfd160582e38b9b99703a41ed1519 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct 02 08:01:01 compute-0 python3[190837]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Oct 02 08:01:01 compute-0 CROND[190914]: (root) CMD (run-parts /etc/cron.hourly)
Oct 02 08:01:01 compute-0 run-parts[190917]: (/etc/cron.hourly) starting 0anacron
Oct 02 08:01:01 compute-0 sudo[190835]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:01 compute-0 anacron[190925]: Anacron started on 2025-10-02
Oct 02 08:01:01 compute-0 anacron[190925]: Will run job `cron.daily' in 26 min.
Oct 02 08:01:01 compute-0 anacron[190925]: Will run job `cron.weekly' in 46 min.
Oct 02 08:01:01 compute-0 anacron[190925]: Will run job `cron.monthly' in 66 min.
Oct 02 08:01:01 compute-0 anacron[190925]: Jobs will be executed sequentially
Oct 02 08:01:01 compute-0 run-parts[190928]: (/etc/cron.hourly) finished 0anacron
Oct 02 08:01:01 compute-0 CROND[190913]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 02 08:01:02 compute-0 sudo[191077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uobugrcpqqcshuoawjqyowqwkmtimjoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392061.7066848-3367-198549903292566/AnsiballZ_stat.py'
Oct 02 08:01:02 compute-0 sudo[191077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:02 compute-0 python3.9[191079]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:01:02 compute-0 sudo[191077]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:03 compute-0 sudo[191231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvkazbueuzbnwjzfzxqnrpiotdqagukz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392062.6438277-3385-10123552838993/AnsiballZ_file.py'
Oct 02 08:01:03 compute-0 sudo[191231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:03 compute-0 python3.9[191233]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:03 compute-0 sudo[191231]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:03 compute-0 sudo[191382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxbpjnybtjflnrhdktppcatfznfifmjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392063.3956642-3385-134972717832114/AnsiballZ_copy.py'
Oct 02 08:01:03 compute-0 sudo[191382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:04 compute-0 python3.9[191384]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759392063.3956642-3385-134972717832114/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:04 compute-0 sudo[191382]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:04 compute-0 sudo[191458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxkwamoqdvfejlsovczzbcloskzfmctq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392063.3956642-3385-134972717832114/AnsiballZ_systemd.py'
Oct 02 08:01:04 compute-0 sudo[191458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:04 compute-0 python3.9[191460]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 08:01:04 compute-0 systemd[1]: Reloading.
Oct 02 08:01:04 compute-0 systemd-rc-local-generator[191488]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 08:01:04 compute-0 systemd-sysv-generator[191492]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 08:01:05 compute-0 sudo[191458]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:05 compute-0 sudo[191568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjloersnsimsivboabkpbecpwodmrqsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392063.3956642-3385-134972717832114/AnsiballZ_systemd.py'
Oct 02 08:01:05 compute-0 sudo[191568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:05 compute-0 python3.9[191570]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 08:01:05 compute-0 systemd[1]: Reloading.
Oct 02 08:01:06 compute-0 systemd-rc-local-generator[191596]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 08:01:06 compute-0 systemd-sysv-generator[191603]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 08:01:06 compute-0 systemd[1]: Starting nova_compute container...
Oct 02 08:01:06 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:01:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:06 compute-0 podman[191609]: 2025-10-02 08:01:06.416517578 +0000 UTC m=+0.149531158 container init efbb211a2f0d2c50bafd3b3e9ede52cf80edfd73757aa793dad9d7e9e4ad46a7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 02 08:01:06 compute-0 podman[191609]: 2025-10-02 08:01:06.431290061 +0000 UTC m=+0.164303581 container start efbb211a2f0d2c50bafd3b3e9ede52cf80edfd73757aa793dad9d7e9e4ad46a7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 02 08:01:06 compute-0 podman[191609]: nova_compute
Oct 02 08:01:06 compute-0 nova_compute[191624]: + sudo -E kolla_set_configs
Oct 02 08:01:06 compute-0 systemd[1]: Started nova_compute container.
Oct 02 08:01:06 compute-0 sudo[191568]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Validating config file
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Copying service configuration files
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Deleting /etc/ceph
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Creating directory /etc/ceph
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /etc/ceph
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Writing out command to execute
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 02 08:01:06 compute-0 nova_compute[191624]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 02 08:01:06 compute-0 nova_compute[191624]: ++ cat /run_command
Oct 02 08:01:06 compute-0 nova_compute[191624]: + CMD=nova-compute
Oct 02 08:01:06 compute-0 nova_compute[191624]: + ARGS=
Oct 02 08:01:06 compute-0 nova_compute[191624]: + sudo kolla_copy_cacerts
Oct 02 08:01:06 compute-0 nova_compute[191624]: + [[ ! -n '' ]]
Oct 02 08:01:06 compute-0 nova_compute[191624]: + . kolla_extend_start
Oct 02 08:01:06 compute-0 nova_compute[191624]: + echo 'Running command: '\''nova-compute'\'''
Oct 02 08:01:06 compute-0 nova_compute[191624]: Running command: 'nova-compute'
Oct 02 08:01:06 compute-0 nova_compute[191624]: + umask 0022
Oct 02 08:01:06 compute-0 nova_compute[191624]: + exec nova-compute
Oct 02 08:01:07 compute-0 python3.9[191786]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:01:08 compute-0 nova_compute[191624]: 2025-10-02 08:01:08.448 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 02 08:01:08 compute-0 nova_compute[191624]: 2025-10-02 08:01:08.448 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 02 08:01:08 compute-0 nova_compute[191624]: 2025-10-02 08:01:08.448 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 02 08:01:08 compute-0 nova_compute[191624]: 2025-10-02 08:01:08.448 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 02 08:01:08 compute-0 python3.9[191936]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:01:08 compute-0 nova_compute[191624]: 2025-10-02 08:01:08.578 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:01:08 compute-0 nova_compute[191624]: 2025-10-02 08:01:08.606 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.080 2 INFO nova.virt.driver [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.225 2 INFO nova.compute.provider_config [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.242 2 DEBUG oslo_concurrency.lockutils [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.243 2 DEBUG oslo_concurrency.lockutils [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.243 2 DEBUG oslo_concurrency.lockutils [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.244 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.244 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.244 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.244 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.245 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.245 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.245 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.245 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.246 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.246 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.246 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.246 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.246 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.247 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.247 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.247 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.247 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.248 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.248 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.248 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.248 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.248 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.249 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.249 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.249 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.249 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.249 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.250 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.250 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.250 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.250 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.251 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.251 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.251 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.251 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.252 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.252 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.252 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.252 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.253 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.253 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.253 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.253 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.253 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.254 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.254 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.254 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.254 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.254 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.255 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.255 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.255 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.255 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.256 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.256 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.256 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.256 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.256 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.257 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.257 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.257 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.257 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.258 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.258 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.258 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.258 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.258 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.259 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.259 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.259 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.259 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.259 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.260 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.260 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.260 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.260 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.260 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.261 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.261 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.261 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.261 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.262 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.262 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.262 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.262 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.263 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.263 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.263 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.263 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.263 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.264 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.264 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.264 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.264 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.264 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.265 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.265 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.265 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.265 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.265 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.266 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.266 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.267 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.267 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.268 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.268 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.269 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.269 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.269 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.270 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.270 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.270 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.271 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.271 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.271 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.271 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.271 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.272 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.272 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.272 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.272 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.273 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.273 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.273 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.273 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.273 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.274 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.274 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.274 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.274 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.274 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.275 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.275 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.275 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.275 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.275 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.276 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.276 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.276 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.276 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.276 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.277 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.277 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.277 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.277 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.277 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.278 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.278 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.278 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.278 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.279 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.279 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.279 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.279 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.279 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.280 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.280 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.280 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.280 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.280 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.281 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.281 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.281 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.281 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.282 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.282 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.282 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.282 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.282 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.283 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.283 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.283 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.283 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.283 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.284 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.284 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.284 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.284 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.284 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.285 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.285 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.285 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.285 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.286 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.286 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.286 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.286 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.286 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.287 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.287 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.287 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.287 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.287 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.288 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.288 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.288 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.288 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.288 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.289 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.289 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.289 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.289 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.289 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.290 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.290 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.290 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.290 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.290 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.291 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.291 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.291 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.291 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.292 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.292 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.292 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.292 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.292 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.293 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.293 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.293 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.293 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.293 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.294 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.294 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.295 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.295 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.295 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.295 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.295 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.296 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.296 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.296 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.296 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.296 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.297 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.297 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.297 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.297 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.297 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.298 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.298 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.298 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.298 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.298 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.299 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.299 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.299 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.299 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.299 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.299 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.299 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.299 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.300 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.300 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.300 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.300 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.300 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.300 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.300 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.300 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.301 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.301 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.301 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.301 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.301 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.301 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.301 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.302 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.302 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.302 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.302 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.302 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.302 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.302 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.303 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.303 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.303 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.303 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.303 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.303 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.303 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.303 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.304 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.304 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.304 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.304 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.304 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.304 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.304 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.305 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.305 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.305 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.305 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.305 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.305 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.305 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.305 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.306 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.306 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.306 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.306 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.306 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.306 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.306 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.307 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.307 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.307 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.307 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.307 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.307 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.307 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.308 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.308 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.308 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.308 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.308 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.308 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.308 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.308 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.309 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.309 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.309 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.309 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.309 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.309 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.309 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.310 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.310 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.310 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.310 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.310 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.310 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.310 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.310 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.311 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.311 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.311 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.311 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.311 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.311 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.311 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.312 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.312 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.312 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.312 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.312 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.312 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.312 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.313 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.313 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.313 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.313 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.313 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.313 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.313 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.314 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.314 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.314 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.314 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.314 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.314 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.315 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.315 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.315 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.315 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.315 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.315 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.315 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.315 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.316 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.316 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.316 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.316 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.316 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.316 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.316 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.316 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.317 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.317 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.317 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.317 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.317 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.317 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.317 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.318 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.318 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.318 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.318 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.318 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.318 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.318 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.319 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.319 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.319 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.319 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.319 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.319 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.319 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.319 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.320 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.320 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.320 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.320 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.320 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.320 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.320 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.321 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.321 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.321 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.321 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.321 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.321 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.321 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.321 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.322 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.322 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.322 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.322 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.322 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.322 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.322 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.323 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.323 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.323 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.323 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.323 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.323 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.323 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.323 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.324 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.324 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.324 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.324 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.324 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.325 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.325 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.325 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.325 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.325 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.325 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.325 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.326 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.326 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.326 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.326 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.326 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.326 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.326 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.327 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.327 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.327 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.327 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.327 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.327 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.327 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.328 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.328 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.328 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.328 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.328 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.328 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.328 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.328 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.329 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.329 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.329 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.329 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.329 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.329 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.329 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.330 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.330 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.330 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.330 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.330 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.330 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.330 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.330 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.331 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.331 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.331 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.331 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.331 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.331 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.331 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.332 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.332 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.332 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.332 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.332 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.332 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.333 2 WARNING oslo_config.cfg [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 02 08:01:09 compute-0 nova_compute[191624]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 02 08:01:09 compute-0 nova_compute[191624]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 02 08:01:09 compute-0 nova_compute[191624]: and ``live_migration_inbound_addr`` respectively.
Oct 02 08:01:09 compute-0 nova_compute[191624]: ).  Its value may be silently ignored in the future.
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.333 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.333 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.333 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.333 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.333 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.334 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.334 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.334 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.334 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.334 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.334 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.334 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.335 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.335 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.335 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.335 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.335 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.335 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.335 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.335 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.336 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.336 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.336 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.336 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.336 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.336 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.336 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.337 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.337 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.337 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.337 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.337 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.337 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.337 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.338 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.338 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.338 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.338 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.338 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.338 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.338 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.339 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.339 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.339 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.339 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.339 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.339 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.339 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.340 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.340 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.340 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.340 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.340 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.340 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.340 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.341 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.341 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.341 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.341 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.341 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.341 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.341 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.341 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.342 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.342 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.342 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.342 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.342 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.342 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.342 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.343 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.343 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.343 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.343 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.343 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.343 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.343 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.343 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.344 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.344 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.344 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.344 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.344 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.344 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.344 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.345 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.345 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.345 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.345 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.345 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.345 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.345 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.345 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.346 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.346 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.346 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.346 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.346 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.346 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.346 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.347 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.347 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.347 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.347 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.347 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.347 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.347 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.347 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.348 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.348 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.348 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.348 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.348 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.348 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.348 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.349 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.349 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.349 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.349 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.349 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.349 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.349 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.350 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.350 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.350 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.350 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.350 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.350 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.350 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.350 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.351 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.351 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.351 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.351 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.351 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.351 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.351 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.352 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.352 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.352 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.352 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.352 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.353 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.353 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.353 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.353 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.353 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.353 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.353 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.353 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.354 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.354 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.354 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.354 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.354 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.354 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.354 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.355 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.355 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.355 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.355 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.355 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.355 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.355 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.356 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.356 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.356 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.356 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.356 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.356 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.356 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.357 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.357 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.357 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.357 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.357 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.357 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.357 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.357 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.358 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.358 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.358 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.358 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.358 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.358 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.358 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.359 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.359 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.359 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.359 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.359 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.359 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.359 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.360 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.360 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.360 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.360 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.360 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.360 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.360 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.361 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.361 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.361 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.361 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.361 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.361 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.361 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.362 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.362 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.362 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.363 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.363 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.363 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.363 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.363 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.363 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.363 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.363 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.364 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.364 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.364 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.364 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.364 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.364 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.364 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.365 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.365 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.365 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.365 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.365 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.365 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.365 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.365 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.366 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.366 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.366 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.366 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.366 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.366 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.366 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.366 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.367 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.367 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.367 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.367 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.367 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.367 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.368 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.368 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.368 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.368 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.368 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.368 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.369 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.369 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.369 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.369 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.369 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.369 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.369 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.369 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.370 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.370 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.370 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.370 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.370 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.370 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.370 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.371 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.371 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.371 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.371 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.371 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.371 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.371 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.372 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.372 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.372 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.372 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.372 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.372 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.372 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.372 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.373 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.373 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.373 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.373 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.373 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.373 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.373 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.374 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.374 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.374 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.374 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.374 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.374 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.374 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.375 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.375 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.375 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.375 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.375 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.375 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.375 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.376 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.376 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.376 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.376 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.376 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.376 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.376 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.377 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.377 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.377 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.377 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.377 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.377 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.377 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.377 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.378 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.378 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.378 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.378 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.378 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.378 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.378 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.379 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.379 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.379 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.379 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.379 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.379 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.379 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.379 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.380 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.380 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.380 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.380 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.380 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.380 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.380 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.381 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.381 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.381 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.381 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.381 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.381 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.381 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.382 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.382 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.382 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.382 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.382 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.382 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.382 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.383 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.383 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.383 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.383 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.383 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.383 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.383 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.383 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.384 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.384 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.384 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.384 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.384 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.384 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.385 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.385 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.385 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.385 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.385 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.385 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.385 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.386 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.386 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.386 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.386 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.386 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.386 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.386 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.387 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.387 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.387 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.387 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.387 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.387 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.387 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.387 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.388 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.388 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.388 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.388 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.388 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.388 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.388 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.389 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.389 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.389 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.389 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.389 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.389 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.389 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.390 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.390 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.390 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.390 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.390 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.390 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.390 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.390 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.391 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.391 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.391 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.391 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.391 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.391 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.391 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.392 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.392 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.392 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.392 2 DEBUG oslo_service.service [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.393 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct 02 08:01:09 compute-0 python3.9[192090]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.411 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.412 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.412 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.412 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct 02 08:01:09 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct 02 08:01:09 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.498 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f37040b1250> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.502 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f37040b1250> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.503 2 INFO nova.virt.libvirt.driver [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Connection event '1' reason 'None'
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.522 2 WARNING nova.virt.libvirt.driver [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 02 08:01:09 compute-0 nova_compute[191624]: 2025-10-02 08:01:09.522 2 DEBUG nova.virt.libvirt.volume.mount [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 02 08:01:10 compute-0 sudo[192292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndjbogwjokinemuevnolmzpdxzqtvatu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392069.7019353-3505-100730440411982/AnsiballZ_podman_container.py'
Oct 02 08:01:10 compute-0 sudo[192292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:10 compute-0 python3.9[192296]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 02 08:01:10 compute-0 sudo[192292]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:10 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.504 2 INFO nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Libvirt host capabilities <capabilities>
Oct 02 08:01:10 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 08:01:10 compute-0 nova_compute[191624]: 
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <host>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <uuid>b127680e-a52a-46b0-96f2-3ca4a3f61658</uuid>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <cpu>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <arch>x86_64</arch>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model>EPYC-Rome-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <vendor>AMD</vendor>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <microcode version='16777317'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <signature family='23' model='49' stepping='0'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='x2apic'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='tsc-deadline'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='osxsave'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='hypervisor'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='tsc_adjust'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='spec-ctrl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='stibp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='arch-capabilities'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='cmp_legacy'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='topoext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='virt-ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='lbrv'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='tsc-scale'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='vmcb-clean'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='pause-filter'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='pfthreshold'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='svme-addr-chk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='rdctl-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='skip-l1dfl-vmentry'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='mds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature name='pschange-mc-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <pages unit='KiB' size='4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <pages unit='KiB' size='2048'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <pages unit='KiB' size='1048576'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </cpu>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <power_management>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <suspend_mem/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <suspend_disk/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <suspend_hybrid/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </power_management>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <iommu support='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <migration_features>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <live/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <uri_transports>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <uri_transport>tcp</uri_transport>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <uri_transport>rdma</uri_transport>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </uri_transports>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </migration_features>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <topology>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <cells num='1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <cell id='0'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:           <memory unit='KiB'>7864104</memory>
Oct 02 08:01:10 compute-0 nova_compute[191624]:           <pages unit='KiB' size='4'>1966026</pages>
Oct 02 08:01:10 compute-0 nova_compute[191624]:           <pages unit='KiB' size='2048'>0</pages>
Oct 02 08:01:10 compute-0 nova_compute[191624]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 02 08:01:10 compute-0 nova_compute[191624]:           <distances>
Oct 02 08:01:10 compute-0 nova_compute[191624]:             <sibling id='0' value='10'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:           </distances>
Oct 02 08:01:10 compute-0 nova_compute[191624]:           <cpus num='8'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:           </cpus>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         </cell>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </cells>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </topology>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <cache>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </cache>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <secmodel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model>selinux</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <doi>0</doi>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </secmodel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <secmodel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model>dac</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <doi>0</doi>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </secmodel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </host>
Oct 02 08:01:10 compute-0 nova_compute[191624]: 
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <guest>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <os_type>hvm</os_type>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <arch name='i686'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <wordsize>32</wordsize>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <domain type='qemu'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <domain type='kvm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </arch>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <features>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <pae/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <nonpae/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <acpi default='on' toggle='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <apic default='on' toggle='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <cpuselection/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <deviceboot/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <disksnapshot default='on' toggle='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <externalSnapshot/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </features>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </guest>
Oct 02 08:01:10 compute-0 nova_compute[191624]: 
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <guest>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <os_type>hvm</os_type>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <arch name='x86_64'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <wordsize>64</wordsize>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <domain type='qemu'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <domain type='kvm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </arch>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <features>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <acpi default='on' toggle='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <apic default='on' toggle='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <cpuselection/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <deviceboot/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <disksnapshot default='on' toggle='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <externalSnapshot/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </features>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </guest>
Oct 02 08:01:10 compute-0 nova_compute[191624]: 
Oct 02 08:01:10 compute-0 nova_compute[191624]: </capabilities>
Oct 02 08:01:10 compute-0 nova_compute[191624]: 
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.512 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.541 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 02 08:01:10 compute-0 nova_compute[191624]: <domainCapabilities>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <path>/usr/libexec/qemu-kvm</path>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <domain>kvm</domain>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <arch>i686</arch>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <vcpu max='240'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <iothreads supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <os supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <enum name='firmware'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <loader supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>rom</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pflash</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='readonly'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>yes</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>no</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='secure'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>no</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </loader>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </os>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <cpu>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='host-passthrough' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='hostPassthroughMigratable'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>on</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>off</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='maximum' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='maximumMigratable'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>on</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>off</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='host-model' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <vendor>AMD</vendor>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='x2apic'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc-deadline'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='hypervisor'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc_adjust'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='spec-ctrl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='stibp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='arch-capabilities'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='cmp_legacy'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='overflow-recov'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='succor'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='amd-ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='virt-ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='lbrv'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc-scale'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='vmcb-clean'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='flushbyasid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pause-filter'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pfthreshold'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='svme-addr-chk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='rdctl-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='mds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pschange-mc-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='gds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='rfds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='disable' name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='custom' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Dhyana-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Genoa'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='auto-ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Genoa-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='auto-ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-128'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-256'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-512'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v6'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v7'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='KnightsMill'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512er'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512pf'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='KnightsMill-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512er'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512pf'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G4-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tbm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G5-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tbm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SierraForest'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cmpccxadd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SierraForest-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cmpccxadd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='athlon'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='athlon-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='core2duo'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='core2duo-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='coreduo'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='coreduo-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='n270'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='n270-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='phenom'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='phenom-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </cpu>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <memoryBacking supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <enum name='sourceType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>file</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>anonymous</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>memfd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </memoryBacking>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <devices>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <disk supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='diskDevice'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>disk</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>cdrom</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>floppy</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>lun</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='bus'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>ide</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>fdc</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>scsi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>sata</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-non-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </disk>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <graphics supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vnc</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>egl-headless</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>dbus</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </graphics>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <video supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='modelType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vga</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>cirrus</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>none</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>bochs</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>ramfb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </video>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <hostdev supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='mode'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>subsystem</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='startupPolicy'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>default</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>mandatory</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>requisite</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>optional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='subsysType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pci</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>scsi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='capsType'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='pciBackend'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </hostdev>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <rng supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-non-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>random</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>egd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>builtin</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </rng>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <filesystem supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='driverType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>path</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>handle</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtiofs</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </filesystem>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <tpm supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tpm-tis</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tpm-crb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>emulator</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>external</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendVersion'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>2.0</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </tpm>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <redirdev supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='bus'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </redirdev>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <channel supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pty</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>unix</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </channel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <crypto supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>qemu</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>builtin</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </crypto>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <interface supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>default</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>passt</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </interface>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <panic supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>isa</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>hyperv</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </panic>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </devices>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <features>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <gic supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <vmcoreinfo supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <genid supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <backingStoreInput supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <backup supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <async-teardown supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <ps2 supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <sev supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <sgx supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <hyperv supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='features'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>relaxed</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vapic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>spinlocks</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vpindex</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>runtime</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>synic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>stimer</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>reset</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vendor_id</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>frequencies</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>reenlightenment</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tlbflush</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>ipi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>avic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>emsr_bitmap</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>xmm_input</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </hyperv>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <launchSecurity supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </features>
Oct 02 08:01:10 compute-0 nova_compute[191624]: </domainCapabilities>
Oct 02 08:01:10 compute-0 nova_compute[191624]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.553 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 02 08:01:10 compute-0 nova_compute[191624]: <domainCapabilities>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <path>/usr/libexec/qemu-kvm</path>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <domain>kvm</domain>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <arch>i686</arch>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <vcpu max='4096'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <iothreads supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <os supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <enum name='firmware'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <loader supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>rom</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pflash</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='readonly'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>yes</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>no</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='secure'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>no</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </loader>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </os>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <cpu>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='host-passthrough' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='hostPassthroughMigratable'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>on</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>off</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='maximum' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='maximumMigratable'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>on</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>off</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='host-model' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <vendor>AMD</vendor>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='x2apic'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc-deadline'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='hypervisor'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc_adjust'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='spec-ctrl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='stibp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='arch-capabilities'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='cmp_legacy'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='overflow-recov'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='succor'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='amd-ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='virt-ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='lbrv'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc-scale'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='vmcb-clean'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='flushbyasid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pause-filter'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pfthreshold'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='svme-addr-chk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='rdctl-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='mds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pschange-mc-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='gds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='rfds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='disable' name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='custom' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Dhyana-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Genoa'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='auto-ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Genoa-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='auto-ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-128'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-256'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-512'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v6'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v7'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='KnightsMill'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512er'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512pf'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='KnightsMill-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512er'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512pf'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G4-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tbm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G5-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tbm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SierraForest'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cmpccxadd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SierraForest-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cmpccxadd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='athlon'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='athlon-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='core2duo'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='core2duo-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='coreduo'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='coreduo-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='n270'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='n270-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='phenom'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='phenom-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </cpu>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <memoryBacking supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <enum name='sourceType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>file</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>anonymous</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>memfd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </memoryBacking>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <devices>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <disk supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='diskDevice'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>disk</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>cdrom</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>floppy</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>lun</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='bus'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>fdc</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>scsi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>sata</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-non-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </disk>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <graphics supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vnc</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>egl-headless</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>dbus</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </graphics>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <video supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='modelType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vga</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>cirrus</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>none</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>bochs</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>ramfb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </video>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <hostdev supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='mode'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>subsystem</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='startupPolicy'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>default</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>mandatory</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>requisite</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>optional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='subsysType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pci</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>scsi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='capsType'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='pciBackend'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </hostdev>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <rng supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-non-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>random</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>egd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>builtin</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </rng>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <filesystem supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='driverType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>path</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>handle</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtiofs</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </filesystem>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <tpm supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tpm-tis</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tpm-crb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>emulator</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>external</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendVersion'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>2.0</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </tpm>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <redirdev supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='bus'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </redirdev>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <channel supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pty</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>unix</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </channel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <crypto supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>qemu</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>builtin</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </crypto>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <interface supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>default</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>passt</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </interface>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <panic supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>isa</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>hyperv</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </panic>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </devices>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <features>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <gic supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <vmcoreinfo supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <genid supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <backingStoreInput supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <backup supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <async-teardown supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <ps2 supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <sev supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <sgx supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <hyperv supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='features'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>relaxed</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vapic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>spinlocks</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vpindex</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>runtime</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>synic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>stimer</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>reset</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vendor_id</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>frequencies</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>reenlightenment</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tlbflush</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>ipi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>avic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>emsr_bitmap</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>xmm_input</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </hyperv>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <launchSecurity supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </features>
Oct 02 08:01:10 compute-0 nova_compute[191624]: </domainCapabilities>
Oct 02 08:01:10 compute-0 nova_compute[191624]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.597 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.601 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 02 08:01:10 compute-0 nova_compute[191624]: <domainCapabilities>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <path>/usr/libexec/qemu-kvm</path>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <domain>kvm</domain>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <arch>x86_64</arch>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <vcpu max='240'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <iothreads supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <os supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <enum name='firmware'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <loader supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>rom</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pflash</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='readonly'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>yes</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>no</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='secure'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>no</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </loader>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </os>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <cpu>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='host-passthrough' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='hostPassthroughMigratable'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>on</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>off</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='maximum' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='maximumMigratable'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>on</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>off</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='host-model' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <vendor>AMD</vendor>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='x2apic'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc-deadline'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='hypervisor'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc_adjust'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='spec-ctrl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='stibp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='arch-capabilities'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='cmp_legacy'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='overflow-recov'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='succor'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='amd-ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='virt-ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='lbrv'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc-scale'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='vmcb-clean'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='flushbyasid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pause-filter'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pfthreshold'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='svme-addr-chk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='rdctl-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='mds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pschange-mc-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='gds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='rfds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='disable' name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='custom' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Dhyana-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Genoa'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='auto-ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Genoa-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='auto-ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-128'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-256'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-512'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v6'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v7'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='KnightsMill'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512er'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512pf'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='KnightsMill-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512er'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512pf'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G4-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tbm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G5-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tbm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SierraForest'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cmpccxadd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SierraForest-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cmpccxadd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='athlon'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='athlon-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='core2duo'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='core2duo-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='coreduo'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='coreduo-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='n270'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='n270-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='phenom'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='phenom-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </cpu>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <memoryBacking supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <enum name='sourceType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>file</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>anonymous</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>memfd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </memoryBacking>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <devices>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <disk supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='diskDevice'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>disk</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>cdrom</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>floppy</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>lun</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='bus'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>ide</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>fdc</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>scsi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>sata</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-non-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </disk>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <graphics supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vnc</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>egl-headless</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>dbus</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </graphics>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <video supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='modelType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vga</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>cirrus</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>none</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>bochs</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>ramfb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </video>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <hostdev supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='mode'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>subsystem</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='startupPolicy'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>default</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>mandatory</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>requisite</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>optional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='subsysType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pci</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>scsi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='capsType'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='pciBackend'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </hostdev>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <rng supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-non-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>random</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>egd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>builtin</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </rng>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <filesystem supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='driverType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>path</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>handle</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtiofs</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </filesystem>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <tpm supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tpm-tis</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tpm-crb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>emulator</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>external</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendVersion'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>2.0</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </tpm>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <redirdev supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='bus'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </redirdev>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <channel supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pty</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>unix</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </channel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <crypto supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>qemu</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>builtin</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </crypto>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <interface supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>default</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>passt</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </interface>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <panic supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>isa</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>hyperv</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </panic>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </devices>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <features>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <gic supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <vmcoreinfo supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <genid supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <backingStoreInput supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <backup supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <async-teardown supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <ps2 supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <sev supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <sgx supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <hyperv supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='features'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>relaxed</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vapic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>spinlocks</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vpindex</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>runtime</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>synic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>stimer</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>reset</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vendor_id</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>frequencies</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>reenlightenment</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tlbflush</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>ipi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>avic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>emsr_bitmap</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>xmm_input</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </hyperv>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <launchSecurity supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </features>
Oct 02 08:01:10 compute-0 nova_compute[191624]: </domainCapabilities>
Oct 02 08:01:10 compute-0 nova_compute[191624]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.664 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 02 08:01:10 compute-0 nova_compute[191624]: <domainCapabilities>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <path>/usr/libexec/qemu-kvm</path>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <domain>kvm</domain>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <arch>x86_64</arch>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <vcpu max='4096'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <iothreads supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <os supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <enum name='firmware'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>efi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <loader supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>rom</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pflash</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='readonly'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>yes</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>no</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='secure'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>yes</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>no</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </loader>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </os>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <cpu>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='host-passthrough' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='hostPassthroughMigratable'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>on</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>off</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='maximum' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='maximumMigratable'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>on</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>off</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='host-model' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <vendor>AMD</vendor>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='x2apic'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc-deadline'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='hypervisor'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc_adjust'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='spec-ctrl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='stibp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='arch-capabilities'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='cmp_legacy'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='overflow-recov'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='succor'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='amd-ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='virt-ssbd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='lbrv'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='tsc-scale'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='vmcb-clean'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='flushbyasid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pause-filter'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pfthreshold'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='svme-addr-chk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='rdctl-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='mds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='pschange-mc-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='gds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='require' name='rfds-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <feature policy='disable' name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <mode name='custom' supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Broadwell-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cascadelake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Cooperlake-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Denverton-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Dhyana-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Genoa'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='auto-ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Genoa-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='auto-ibrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Milan-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amd-psfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='stibp-always-on'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-Rome-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='EPYC-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='GraniteRapids-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-128'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-256'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx10-512'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='prefetchiti'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Haswell-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-noTSX'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v6'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Icelake-Server-v7'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='IvyBridge-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='KnightsMill'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512er'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512pf'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='KnightsMill-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512er'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512pf'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G4-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tbm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Opteron_G5-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fma4'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tbm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xop'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SapphireRapids-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='amx-tile'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-bf16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-fp16'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bitalg'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrc'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fzrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='la57'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='taa-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xfd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SierraForest'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cmpccxadd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='SierraForest-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ifma'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cmpccxadd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fbsdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='fsrs'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ibrs-all'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mcdt-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pbrsb-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='psdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='serialize'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vaes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Client-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='hle'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='rtm'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Skylake-Server-v5'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512bw'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512cd'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512dq'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512f'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='avx512vl'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='invpcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pcid'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='pku'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='mpx'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v2'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v3'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='core-capability'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='split-lock-detect'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='Snowridge-v4'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='cldemote'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='erms'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='gfni'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdir64b'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='movdiri'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='xsaves'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='athlon'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='athlon-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='core2duo'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='core2duo-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='coreduo'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='coreduo-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='n270'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='n270-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='ss'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='phenom'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <blockers model='phenom-v1'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnow'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <feature name='3dnowext'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </blockers>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </mode>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </cpu>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <memoryBacking supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <enum name='sourceType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>file</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>anonymous</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <value>memfd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </memoryBacking>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <devices>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <disk supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='diskDevice'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>disk</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>cdrom</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>floppy</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>lun</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='bus'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>fdc</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>scsi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>sata</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-non-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </disk>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <graphics supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vnc</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>egl-headless</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>dbus</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </graphics>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <video supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='modelType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vga</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>cirrus</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>none</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>bochs</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>ramfb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </video>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <hostdev supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='mode'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>subsystem</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='startupPolicy'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>default</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>mandatory</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>requisite</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>optional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='subsysType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pci</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>scsi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='capsType'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='pciBackend'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </hostdev>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <rng supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtio-non-transitional</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>random</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>egd</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>builtin</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </rng>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <filesystem supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='driverType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>path</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>handle</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>virtiofs</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </filesystem>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <tpm supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tpm-tis</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tpm-crb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>emulator</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>external</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendVersion'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>2.0</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </tpm>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <redirdev supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='bus'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>usb</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </redirdev>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <channel supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>pty</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>unix</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </channel>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <crypto supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='type'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>qemu</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendModel'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>builtin</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </crypto>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <interface supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='backendType'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>default</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>passt</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </interface>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <panic supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='model'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>isa</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>hyperv</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </panic>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </devices>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   <features>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <gic supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <vmcoreinfo supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <genid supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <backingStoreInput supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <backup supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <async-teardown supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <ps2 supported='yes'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <sev supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <sgx supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <hyperv supported='yes'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       <enum name='features'>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>relaxed</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vapic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>spinlocks</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vpindex</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>runtime</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>synic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>stimer</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>reset</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>vendor_id</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>frequencies</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>reenlightenment</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>tlbflush</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>ipi</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>avic</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>emsr_bitmap</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:         <value>xmm_input</value>
Oct 02 08:01:10 compute-0 nova_compute[191624]:       </enum>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     </hyperv>
Oct 02 08:01:10 compute-0 nova_compute[191624]:     <launchSecurity supported='no'/>
Oct 02 08:01:10 compute-0 nova_compute[191624]:   </features>
Oct 02 08:01:10 compute-0 nova_compute[191624]: </domainCapabilities>
Oct 02 08:01:10 compute-0 nova_compute[191624]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.718 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.718 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.718 2 DEBUG nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.718 2 INFO nova.virt.libvirt.host [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Secure Boot support detected
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.720 2 INFO nova.virt.libvirt.driver [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.720 2 INFO nova.virt.libvirt.driver [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.729 2 DEBUG nova.virt.libvirt.driver [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.774 2 INFO nova.virt.node [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Determined node identity e7f6698e-de2d-4705-8493-a3445ce0cf6e from /var/lib/nova/compute_id
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.792 2 WARNING nova.compute.manager [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Compute nodes ['e7f6698e-de2d-4705-8493-a3445ce0cf6e'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.832 2 INFO nova.compute.manager [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.868 2 WARNING nova.compute.manager [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.868 2 DEBUG oslo_concurrency.lockutils [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.869 2 DEBUG oslo_concurrency.lockutils [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.869 2 DEBUG oslo_concurrency.lockutils [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:01:10 compute-0 nova_compute[191624]: 2025-10-02 08:01:10.869 2 DEBUG nova.compute.resource_tracker [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:01:10 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct 02 08:01:10 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct 02 08:01:11 compute-0 sudo[192502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enxmswdxilduydxyywenuehrfdnjpzwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392070.8322568-3521-200691825420096/AnsiballZ_systemd.py'
Oct 02 08:01:11 compute-0 sudo[192502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.176 2 WARNING nova.virt.libvirt.driver [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.178 2 DEBUG nova.compute.resource_tracker [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6172MB free_disk=73.67152404785156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.178 2 DEBUG oslo_concurrency.lockutils [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.178 2 DEBUG oslo_concurrency.lockutils [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.192 2 WARNING nova.compute.resource_tracker [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] No compute node record for compute-0.ctlplane.example.com:e7f6698e-de2d-4705-8493-a3445ce0cf6e: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host e7f6698e-de2d-4705-8493-a3445ce0cf6e could not be found.
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.215 2 INFO nova.compute.resource_tracker [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: e7f6698e-de2d-4705-8493-a3445ce0cf6e
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.298 2 DEBUG nova.compute.resource_tracker [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.299 2 DEBUG nova.compute.resource_tracker [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:01:11 compute-0 python3.9[192504]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 08:01:11 compute-0 systemd[1]: Stopping nova_compute container...
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.558 2 DEBUG oslo_concurrency.lockutils [None req-31b00215-2475-41cd-b5b3-3429a64c415d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.380s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.559 2 DEBUG oslo_concurrency.lockutils [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.559 2 DEBUG oslo_concurrency.lockutils [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:01:11 compute-0 nova_compute[191624]: 2025-10-02 08:01:11.560 2 DEBUG oslo_concurrency.lockutils [None req-13e58f01-10b2-4560-9a7b-d8199a84d7ca - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:01:11 compute-0 virtqemud[192112]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct 02 08:01:11 compute-0 virtqemud[192112]: hostname: compute-0
Oct 02 08:01:11 compute-0 virtqemud[192112]: End of file while reading data: Input/output error
Oct 02 08:01:11 compute-0 systemd[1]: libpod-efbb211a2f0d2c50bafd3b3e9ede52cf80edfd73757aa793dad9d7e9e4ad46a7.scope: Deactivated successfully.
Oct 02 08:01:11 compute-0 systemd[1]: libpod-efbb211a2f0d2c50bafd3b3e9ede52cf80edfd73757aa793dad9d7e9e4ad46a7.scope: Consumed 3.124s CPU time.
Oct 02 08:01:11 compute-0 podman[192508]: 2025-10-02 08:01:11.959003297 +0000 UTC m=+0.458211100 container died efbb211a2f0d2c50bafd3b3e9ede52cf80edfd73757aa793dad9d7e9e4ad46a7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:01:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-efbb211a2f0d2c50bafd3b3e9ede52cf80edfd73757aa793dad9d7e9e4ad46a7-userdata-shm.mount: Deactivated successfully.
Oct 02 08:01:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f-merged.mount: Deactivated successfully.
Oct 02 08:01:12 compute-0 podman[192508]: 2025-10-02 08:01:12.046126661 +0000 UTC m=+0.545334464 container cleanup efbb211a2f0d2c50bafd3b3e9ede52cf80edfd73757aa793dad9d7e9e4ad46a7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 02 08:01:12 compute-0 podman[192508]: nova_compute
Oct 02 08:01:12 compute-0 podman[192539]: nova_compute
Oct 02 08:01:12 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct 02 08:01:12 compute-0 systemd[1]: Stopped nova_compute container.
Oct 02 08:01:12 compute-0 systemd[1]: Starting nova_compute container...
Oct 02 08:01:12 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:01:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c9f28386cafe76f5075f6a8a2e6f8f49527a9107faa3c67922313d8ff76e0f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:12 compute-0 podman[192552]: 2025-10-02 08:01:12.266659878 +0000 UTC m=+0.119845726 container init efbb211a2f0d2c50bafd3b3e9ede52cf80edfd73757aa793dad9d7e9e4ad46a7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:01:12 compute-0 podman[192552]: 2025-10-02 08:01:12.280314988 +0000 UTC m=+0.133500786 container start efbb211a2f0d2c50bafd3b3e9ede52cf80edfd73757aa793dad9d7e9e4ad46a7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm)
Oct 02 08:01:12 compute-0 podman[192552]: nova_compute
Oct 02 08:01:12 compute-0 nova_compute[192567]: + sudo -E kolla_set_configs
Oct 02 08:01:12 compute-0 systemd[1]: Started nova_compute container.
Oct 02 08:01:12 compute-0 sudo[192502]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Validating config file
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Copying service configuration files
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Deleting /etc/ceph
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Creating directory /etc/ceph
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /etc/ceph
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Writing out command to execute
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 02 08:01:12 compute-0 nova_compute[192567]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 02 08:01:12 compute-0 nova_compute[192567]: ++ cat /run_command
Oct 02 08:01:12 compute-0 nova_compute[192567]: + CMD=nova-compute
Oct 02 08:01:12 compute-0 nova_compute[192567]: + ARGS=
Oct 02 08:01:12 compute-0 nova_compute[192567]: + sudo kolla_copy_cacerts
Oct 02 08:01:12 compute-0 nova_compute[192567]: + [[ ! -n '' ]]
Oct 02 08:01:12 compute-0 nova_compute[192567]: + . kolla_extend_start
Oct 02 08:01:12 compute-0 nova_compute[192567]: + echo 'Running command: '\''nova-compute'\'''
Oct 02 08:01:12 compute-0 nova_compute[192567]: Running command: 'nova-compute'
Oct 02 08:01:12 compute-0 nova_compute[192567]: + umask 0022
Oct 02 08:01:12 compute-0 nova_compute[192567]: + exec nova-compute
Oct 02 08:01:13 compute-0 sudo[192728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rromjyytxgjjentoiskhqhqroitteddz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392072.708146-3539-220719495399202/AnsiballZ_podman_container.py'
Oct 02 08:01:13 compute-0 sudo[192728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:13 compute-0 python3.9[192730]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 02 08:01:13 compute-0 systemd[1]: Started libpod-conmon-05e1d1c1ed5c4659c691b7799e217a71735860f200a1892da2fcfaa87f3417ce.scope.
Oct 02 08:01:13 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:01:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/535e4baaf62b50e3118af5aa71a716dd6dbf2f82b25d4cd9bb87dceb6f41a9d6/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/535e4baaf62b50e3118af5aa71a716dd6dbf2f82b25d4cd9bb87dceb6f41a9d6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/535e4baaf62b50e3118af5aa71a716dd6dbf2f82b25d4cd9bb87dceb6f41a9d6/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct 02 08:01:13 compute-0 podman[192756]: 2025-10-02 08:01:13.640347158 +0000 UTC m=+0.165786556 container init 05e1d1c1ed5c4659c691b7799e217a71735860f200a1892da2fcfaa87f3417ce (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0)
Oct 02 08:01:13 compute-0 podman[192756]: 2025-10-02 08:01:13.652446771 +0000 UTC m=+0.177886139 container start 05e1d1c1ed5c4659c691b7799e217a71735860f200a1892da2fcfaa87f3417ce (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:01:13 compute-0 python3.9[192730]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Applying nova statedir ownership
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct 02 08:01:13 compute-0 nova_compute_init[192777]: INFO:nova_statedir:Nova statedir ownership complete
Oct 02 08:01:13 compute-0 systemd[1]: libpod-05e1d1c1ed5c4659c691b7799e217a71735860f200a1892da2fcfaa87f3417ce.scope: Deactivated successfully.
Oct 02 08:01:13 compute-0 podman[192789]: 2025-10-02 08:01:13.795833864 +0000 UTC m=+0.039617310 container died 05e1d1c1ed5c4659c691b7799e217a71735860f200a1892da2fcfaa87f3417ce (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=edpm)
Oct 02 08:01:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-05e1d1c1ed5c4659c691b7799e217a71735860f200a1892da2fcfaa87f3417ce-userdata-shm.mount: Deactivated successfully.
Oct 02 08:01:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-535e4baaf62b50e3118af5aa71a716dd6dbf2f82b25d4cd9bb87dceb6f41a9d6-merged.mount: Deactivated successfully.
Oct 02 08:01:13 compute-0 podman[192789]: 2025-10-02 08:01:13.846786542 +0000 UTC m=+0.090569928 container cleanup 05e1d1c1ed5c4659c691b7799e217a71735860f200a1892da2fcfaa87f3417ce (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.schema-version=1.0, container_name=nova_compute_init, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:01:13 compute-0 systemd[1]: libpod-conmon-05e1d1c1ed5c4659c691b7799e217a71735860f200a1892da2fcfaa87f3417ce.scope: Deactivated successfully.
Oct 02 08:01:13 compute-0 sudo[192728]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:14 compute-0 sshd-session[158136]: Connection closed by 192.168.122.30 port 43794
Oct 02 08:01:14 compute-0 sshd-session[158133]: pam_unix(sshd:session): session closed for user zuul
Oct 02 08:01:14 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Oct 02 08:01:14 compute-0 systemd[1]: session-25.scope: Consumed 2min 43.777s CPU time.
Oct 02 08:01:14 compute-0 systemd-logind[827]: Session 25 logged out. Waiting for processes to exit.
Oct 02 08:01:14 compute-0 systemd-logind[827]: Removed session 25.
Oct 02 08:01:14 compute-0 nova_compute[192567]: 2025-10-02 08:01:14.452 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 02 08:01:14 compute-0 nova_compute[192567]: 2025-10-02 08:01:14.452 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 02 08:01:14 compute-0 nova_compute[192567]: 2025-10-02 08:01:14.453 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 02 08:01:14 compute-0 nova_compute[192567]: 2025-10-02 08:01:14.453 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 02 08:01:14 compute-0 nova_compute[192567]: 2025-10-02 08:01:14.582 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:01:14 compute-0 nova_compute[192567]: 2025-10-02 08:01:14.614 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.043 2 INFO nova.virt.driver [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.170 2 INFO nova.compute.provider_config [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.191 2 DEBUG oslo_concurrency.lockutils [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.192 2 DEBUG oslo_concurrency.lockutils [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.192 2 DEBUG oslo_concurrency.lockutils [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.193 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.193 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.193 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.194 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.194 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.194 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.195 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.195 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.195 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.195 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.196 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.196 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.196 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.196 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.197 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.197 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.197 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.198 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.198 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.198 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.198 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.199 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.199 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.199 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.200 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.200 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.200 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.201 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.201 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.201 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.201 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.202 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.202 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.202 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.203 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.203 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.203 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.203 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.204 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.204 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.204 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.205 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.205 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.205 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.206 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.206 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.206 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.207 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.207 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.207 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.207 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.208 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.208 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.208 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.209 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.209 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.209 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.209 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.210 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.210 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.210 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.211 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.211 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.211 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.211 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.212 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.212 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.212 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.212 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.213 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.213 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.213 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.214 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.214 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.214 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.214 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.215 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.215 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.215 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.216 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.216 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.216 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.217 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.217 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.217 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.217 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.218 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.218 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.218 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.218 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.218 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.219 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.219 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.219 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.219 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.219 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.219 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.220 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.220 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.220 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.220 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.220 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.221 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.221 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.221 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.221 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.222 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.222 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.222 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.223 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.223 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.224 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.224 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.224 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.225 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.225 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.225 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.226 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.226 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.226 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.226 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.226 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.226 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.226 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.227 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.227 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.227 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.227 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.227 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.227 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.227 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.228 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.228 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.228 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.228 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.228 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.228 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.228 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.229 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.229 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.229 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.229 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.229 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.230 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.230 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.230 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.230 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.230 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.230 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.230 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.231 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.231 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.231 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.231 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.231 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.231 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.232 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.232 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.232 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.232 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.232 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.232 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.232 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.233 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.233 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.233 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.233 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.233 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.233 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.234 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.234 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.234 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.234 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.234 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.234 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.234 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.235 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.235 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.235 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.235 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.235 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.235 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.236 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.236 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.236 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.236 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.236 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.236 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.236 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.237 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.237 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.237 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.237 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.237 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.237 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.237 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.238 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.238 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.238 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.238 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.238 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.238 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.239 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.239 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.239 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.239 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.239 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.239 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.239 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.240 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.240 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.240 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.240 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.240 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.240 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.240 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.241 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.241 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.241 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.241 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.241 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.241 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.242 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.242 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.242 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.242 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.242 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.242 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.242 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.243 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.243 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.243 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.243 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.243 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.243 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.243 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.244 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.244 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.244 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.244 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.244 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.244 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.245 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.245 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.245 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.245 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.245 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.245 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.245 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.246 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.246 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.246 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.246 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.246 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.246 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.247 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.247 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.247 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.247 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.247 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.247 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.247 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.248 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.248 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.248 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.248 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.248 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.248 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.249 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.249 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.249 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.249 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.249 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.249 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.250 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.250 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.250 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.250 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.250 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.250 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.250 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.251 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.251 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.251 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.251 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.251 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.251 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.251 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.252 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.252 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.252 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.252 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.252 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.252 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.252 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.253 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.253 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.253 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.253 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.253 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.253 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.254 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.254 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.254 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.254 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.254 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.254 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.254 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.255 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.255 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.255 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.255 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.255 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.255 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.255 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.256 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.256 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.256 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.256 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.256 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.256 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.257 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.257 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.257 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.257 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.257 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.257 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.258 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.258 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.258 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.258 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.258 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.258 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.258 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.259 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.259 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.259 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.259 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.259 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.259 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.259 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.260 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.260 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.260 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.260 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.260 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.260 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.260 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.261 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.261 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.261 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.261 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.261 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.261 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.261 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.262 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.262 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.262 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.262 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.262 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.263 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.263 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.263 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.263 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.263 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.263 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.263 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.264 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.264 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.264 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.264 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.264 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.264 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.264 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.265 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.265 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.265 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.265 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.265 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.265 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.266 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.266 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.266 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.266 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.266 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.266 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.266 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.267 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.267 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.267 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.267 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.267 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.267 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.268 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.268 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.268 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.268 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.268 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.268 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.268 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.269 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.269 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.269 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.269 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.269 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.270 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.270 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.270 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.270 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.270 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.270 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.271 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.271 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.271 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.271 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.271 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.271 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.272 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.272 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.272 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.272 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.272 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.272 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.273 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.273 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.273 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.273 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.273 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.273 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.274 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.274 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.274 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.274 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.274 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.274 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.274 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.275 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.275 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.275 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.275 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.275 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.275 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.276 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.276 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.276 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.276 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.276 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.276 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.277 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.277 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.277 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.277 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.277 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.277 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.277 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.278 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.278 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.278 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.278 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.278 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.278 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.278 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.279 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.279 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.279 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.279 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.279 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.279 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.280 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.280 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.280 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.280 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.280 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.280 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.280 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.281 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.281 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.281 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.281 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.281 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.281 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.282 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.282 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.282 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.282 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.282 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.282 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.282 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.283 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.283 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.283 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.283 2 WARNING oslo_config.cfg [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 02 08:01:15 compute-0 nova_compute[192567]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 02 08:01:15 compute-0 nova_compute[192567]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 02 08:01:15 compute-0 nova_compute[192567]: and ``live_migration_inbound_addr`` respectively.
Oct 02 08:01:15 compute-0 nova_compute[192567]: ).  Its value may be silently ignored in the future.
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.283 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.283 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.284 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.284 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.284 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.284 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.284 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.284 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.285 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.285 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.285 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.285 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.285 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.285 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.285 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.286 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.286 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.286 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.286 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.286 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.286 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.287 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.287 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.287 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.287 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.287 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.287 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.287 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.288 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.288 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.288 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.288 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.288 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.288 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.288 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.289 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.289 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.289 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.289 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.289 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.289 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.290 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.290 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.290 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.290 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.290 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.290 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.290 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.291 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.291 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.291 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.291 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.291 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.291 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.291 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.292 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.292 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.292 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.292 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.292 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.292 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.292 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.293 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.293 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.293 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.293 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.293 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.293 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.293 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.294 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.294 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.294 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.294 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.294 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.294 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.294 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.295 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.295 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.295 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.295 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.295 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.295 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.295 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.296 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.296 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.296 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.296 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.296 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.297 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.297 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.297 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.297 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.297 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.297 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.298 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.298 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.298 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.298 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.298 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.299 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.299 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.299 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.299 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.299 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.299 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.299 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.300 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.300 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.300 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.300 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.300 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.300 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.300 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.301 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.301 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.301 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.301 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.301 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.302 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.302 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.302 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.302 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.302 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.303 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.303 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.303 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.303 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.303 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.304 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.304 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.304 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.304 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.304 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.304 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.305 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.305 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.305 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.305 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.305 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.306 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.306 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.306 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.306 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.306 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.306 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.307 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.307 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.307 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.307 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.307 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.307 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.307 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.308 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.308 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.308 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.308 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.308 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.309 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.309 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.309 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.309 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.309 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.309 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.309 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.310 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.310 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.310 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.310 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.310 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.310 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.310 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.311 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.311 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.311 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.311 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.311 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.311 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.311 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.312 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.312 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.312 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.312 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.312 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.312 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.313 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.313 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.313 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.313 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.313 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.313 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.313 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.314 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.314 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.314 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.314 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.314 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.314 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.315 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.315 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.315 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.315 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.315 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.316 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.316 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.316 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.316 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.316 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.316 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.317 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.317 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.317 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.317 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.317 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.317 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.318 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.318 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.318 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.318 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.318 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.318 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.318 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.319 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.319 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.319 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.319 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.319 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.319 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.319 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.320 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.320 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.320 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.320 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.320 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.320 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.320 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.321 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.321 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.321 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.321 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.321 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.321 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.321 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.322 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.322 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.322 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.322 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.322 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.322 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.323 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.323 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.323 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.323 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.323 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.323 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.324 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.324 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.324 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.324 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.324 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.324 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.324 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.325 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.325 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.325 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.325 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.325 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.325 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.325 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.326 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.326 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.326 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.326 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.326 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.326 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.326 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.327 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.327 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.327 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.327 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.327 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.327 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.327 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.328 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.328 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.328 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.328 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.328 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.328 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.328 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.329 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.329 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.329 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.329 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.329 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.330 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.330 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.330 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.330 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.330 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.330 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.331 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.331 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.331 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.331 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.331 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.331 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.331 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.332 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.332 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.332 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.332 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.332 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.332 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.332 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.333 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.333 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.333 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.333 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.333 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.333 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.334 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.334 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.334 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.334 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.334 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.334 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.334 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.335 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.335 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.335 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.335 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.335 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.335 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.335 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.336 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.336 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.336 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.336 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.336 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.336 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.337 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.337 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.337 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.337 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.337 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.337 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.337 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.338 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.338 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.338 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.338 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.338 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.338 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.338 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.339 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.339 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.339 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.339 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.339 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.339 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.339 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.339 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.340 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.340 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.340 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.340 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.340 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.340 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.341 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.341 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.341 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.341 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.341 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.341 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.341 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.342 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.342 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.342 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.342 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.342 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.342 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.342 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.343 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.343 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.343 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.343 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.343 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.343 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.343 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.343 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.344 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.344 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.344 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.344 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.344 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.344 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.344 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.345 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.345 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.345 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.345 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.345 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.345 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.346 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.346 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.346 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.346 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.346 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.346 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.346 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.347 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.347 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.347 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.347 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.347 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.347 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.347 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.348 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.348 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.348 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.348 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.348 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.348 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.348 2 DEBUG oslo_service.service [None req-b399bdc9-98bc-4361-acda-0fbece13646d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.350 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.378 2 INFO nova.virt.node [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Determined node identity e7f6698e-de2d-4705-8493-a3445ce0cf6e from /var/lib/nova/compute_id
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.379 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.380 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.380 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.380 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.398 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f91790b3af0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.402 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f91790b3af0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.403 2 INFO nova.virt.libvirt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Connection event '1' reason 'None'
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.415 2 INFO nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Libvirt host capabilities <capabilities>
Oct 02 08:01:15 compute-0 nova_compute[192567]: 
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <host>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <uuid>b127680e-a52a-46b0-96f2-3ca4a3f61658</uuid>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <cpu>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <arch>x86_64</arch>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model>EPYC-Rome-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <vendor>AMD</vendor>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <microcode version='16777317'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <signature family='23' model='49' stepping='0'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='x2apic'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='tsc-deadline'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='osxsave'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='hypervisor'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='tsc_adjust'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='spec-ctrl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='stibp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='arch-capabilities'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='cmp_legacy'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='topoext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='virt-ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='lbrv'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='tsc-scale'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='vmcb-clean'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='pause-filter'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='pfthreshold'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='svme-addr-chk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='rdctl-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='skip-l1dfl-vmentry'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='mds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature name='pschange-mc-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <pages unit='KiB' size='4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <pages unit='KiB' size='2048'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <pages unit='KiB' size='1048576'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </cpu>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <power_management>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <suspend_mem/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <suspend_disk/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <suspend_hybrid/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </power_management>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <iommu support='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <migration_features>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <live/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <uri_transports>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <uri_transport>tcp</uri_transport>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <uri_transport>rdma</uri_transport>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </uri_transports>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </migration_features>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <topology>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <cells num='1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <cell id='0'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:           <memory unit='KiB'>7864104</memory>
Oct 02 08:01:15 compute-0 nova_compute[192567]:           <pages unit='KiB' size='4'>1966026</pages>
Oct 02 08:01:15 compute-0 nova_compute[192567]:           <pages unit='KiB' size='2048'>0</pages>
Oct 02 08:01:15 compute-0 nova_compute[192567]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 02 08:01:15 compute-0 nova_compute[192567]:           <distances>
Oct 02 08:01:15 compute-0 nova_compute[192567]:             <sibling id='0' value='10'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:           </distances>
Oct 02 08:01:15 compute-0 nova_compute[192567]:           <cpus num='8'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:           </cpus>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         </cell>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </cells>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </topology>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <cache>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </cache>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <secmodel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model>selinux</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <doi>0</doi>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </secmodel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <secmodel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model>dac</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <doi>0</doi>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </secmodel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </host>
Oct 02 08:01:15 compute-0 nova_compute[192567]: 
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <guest>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <os_type>hvm</os_type>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <arch name='i686'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <wordsize>32</wordsize>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <domain type='qemu'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <domain type='kvm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </arch>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <features>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <pae/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <nonpae/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <acpi default='on' toggle='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <apic default='on' toggle='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <cpuselection/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <deviceboot/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <disksnapshot default='on' toggle='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <externalSnapshot/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </features>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </guest>
Oct 02 08:01:15 compute-0 nova_compute[192567]: 
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <guest>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <os_type>hvm</os_type>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <arch name='x86_64'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <wordsize>64</wordsize>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <domain type='qemu'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <domain type='kvm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </arch>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <features>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <acpi default='on' toggle='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <apic default='on' toggle='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <cpuselection/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <deviceboot/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <disksnapshot default='on' toggle='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <externalSnapshot/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </features>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </guest>
Oct 02 08:01:15 compute-0 nova_compute[192567]: 
Oct 02 08:01:15 compute-0 nova_compute[192567]: </capabilities>
Oct 02 08:01:15 compute-0 nova_compute[192567]: 
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.424 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.439 2 DEBUG nova.virt.libvirt.volume.mount [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.444 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 02 08:01:15 compute-0 nova_compute[192567]: <domainCapabilities>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <path>/usr/libexec/qemu-kvm</path>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <domain>kvm</domain>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <arch>i686</arch>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <vcpu max='240'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <iothreads supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <os supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <enum name='firmware'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <loader supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>rom</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pflash</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='readonly'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>yes</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>no</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='secure'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>no</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </loader>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </os>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <cpu>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='host-passthrough' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='hostPassthroughMigratable'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>on</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>off</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='maximum' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='maximumMigratable'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>on</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>off</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='host-model' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <vendor>AMD</vendor>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='x2apic'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc-deadline'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='hypervisor'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc_adjust'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='spec-ctrl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='stibp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='arch-capabilities'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='cmp_legacy'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='overflow-recov'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='succor'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='amd-ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='virt-ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='lbrv'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc-scale'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='vmcb-clean'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='flushbyasid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pause-filter'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pfthreshold'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='svme-addr-chk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='rdctl-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='mds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pschange-mc-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='gds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='rfds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='disable' name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='custom' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Dhyana-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Genoa'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='auto-ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Genoa-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='auto-ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-128'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-256'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-512'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v6'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v7'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='KnightsMill'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512er'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512pf'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='KnightsMill-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512er'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512pf'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G4-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tbm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G5-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tbm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SierraForest'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cmpccxadd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SierraForest-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cmpccxadd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='athlon'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='athlon-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='core2duo'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='core2duo-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='coreduo'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='coreduo-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='n270'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='n270-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='phenom'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='phenom-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <memoryBacking supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <enum name='sourceType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>file</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>anonymous</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>memfd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </memoryBacking>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <disk supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='diskDevice'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>disk</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>cdrom</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>floppy</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>lun</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='bus'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>ide</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>fdc</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>scsi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>sata</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-non-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <graphics supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vnc</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>egl-headless</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>dbus</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </graphics>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <video supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='modelType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vga</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>cirrus</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>none</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>bochs</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>ramfb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </video>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <hostdev supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='mode'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>subsystem</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='startupPolicy'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>default</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>mandatory</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>requisite</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>optional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='subsysType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pci</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>scsi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='capsType'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='pciBackend'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </hostdev>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <rng supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-non-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>random</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>egd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>builtin</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <filesystem supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='driverType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>path</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>handle</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtiofs</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </filesystem>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <tpm supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tpm-tis</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tpm-crb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>emulator</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>external</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendVersion'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>2.0</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </tpm>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <redirdev supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='bus'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </redirdev>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <channel supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pty</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>unix</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </channel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <crypto supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>qemu</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>builtin</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </crypto>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <interface supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>default</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>passt</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <panic supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>isa</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>hyperv</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </panic>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <features>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <gic supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <vmcoreinfo supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <genid supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <backingStoreInput supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <backup supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <async-teardown supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <ps2 supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <sev supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <sgx supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <hyperv supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='features'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>relaxed</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vapic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>spinlocks</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vpindex</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>runtime</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>synic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>stimer</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>reset</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vendor_id</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>frequencies</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>reenlightenment</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tlbflush</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>ipi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>avic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>emsr_bitmap</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>xmm_input</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </hyperv>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <launchSecurity supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </features>
Oct 02 08:01:15 compute-0 nova_compute[192567]: </domainCapabilities>
Oct 02 08:01:15 compute-0 nova_compute[192567]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.455 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 02 08:01:15 compute-0 nova_compute[192567]: <domainCapabilities>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <path>/usr/libexec/qemu-kvm</path>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <domain>kvm</domain>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <arch>i686</arch>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <vcpu max='4096'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <iothreads supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <os supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <enum name='firmware'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <loader supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>rom</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pflash</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='readonly'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>yes</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>no</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='secure'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>no</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </loader>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </os>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <cpu>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='host-passthrough' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='hostPassthroughMigratable'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>on</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>off</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='maximum' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='maximumMigratable'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>on</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>off</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='host-model' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <vendor>AMD</vendor>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='x2apic'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc-deadline'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='hypervisor'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc_adjust'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='spec-ctrl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='stibp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='arch-capabilities'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='cmp_legacy'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='overflow-recov'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='succor'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='amd-ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='virt-ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='lbrv'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc-scale'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='vmcb-clean'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='flushbyasid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pause-filter'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pfthreshold'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='svme-addr-chk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='rdctl-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='mds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pschange-mc-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='gds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='rfds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='disable' name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='custom' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Dhyana-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Genoa'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='auto-ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Genoa-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='auto-ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-128'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-256'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-512'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v6'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v7'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='KnightsMill'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512er'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512pf'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='KnightsMill-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512er'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512pf'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G4-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tbm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G5-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tbm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SierraForest'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cmpccxadd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SierraForest-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cmpccxadd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='athlon'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='athlon-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='core2duo'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='core2duo-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='coreduo'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='coreduo-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='n270'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='n270-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='phenom'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='phenom-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <memoryBacking supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <enum name='sourceType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>file</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>anonymous</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>memfd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </memoryBacking>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <disk supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='diskDevice'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>disk</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>cdrom</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>floppy</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>lun</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='bus'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>fdc</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>scsi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>sata</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-non-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <graphics supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vnc</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>egl-headless</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>dbus</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </graphics>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <video supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='modelType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vga</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>cirrus</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>none</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>bochs</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>ramfb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </video>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <hostdev supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='mode'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>subsystem</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='startupPolicy'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>default</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>mandatory</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>requisite</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>optional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='subsysType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pci</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>scsi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='capsType'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='pciBackend'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </hostdev>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <rng supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-non-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>random</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>egd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>builtin</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <filesystem supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='driverType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>path</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>handle</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtiofs</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </filesystem>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <tpm supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tpm-tis</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tpm-crb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>emulator</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>external</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendVersion'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>2.0</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </tpm>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <redirdev supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='bus'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </redirdev>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <channel supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pty</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>unix</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </channel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <crypto supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>qemu</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>builtin</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </crypto>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <interface supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>default</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>passt</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <panic supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>isa</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>hyperv</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </panic>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <features>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <gic supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <vmcoreinfo supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <genid supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <backingStoreInput supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <backup supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <async-teardown supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <ps2 supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <sev supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <sgx supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <hyperv supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='features'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>relaxed</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vapic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>spinlocks</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vpindex</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>runtime</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>synic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>stimer</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>reset</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vendor_id</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>frequencies</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>reenlightenment</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tlbflush</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>ipi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>avic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>emsr_bitmap</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>xmm_input</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </hyperv>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <launchSecurity supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </features>
Oct 02 08:01:15 compute-0 nova_compute[192567]: </domainCapabilities>
Oct 02 08:01:15 compute-0 nova_compute[192567]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.494 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.503 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 02 08:01:15 compute-0 nova_compute[192567]: <domainCapabilities>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <path>/usr/libexec/qemu-kvm</path>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <domain>kvm</domain>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <arch>x86_64</arch>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <vcpu max='240'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <iothreads supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <os supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <enum name='firmware'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <loader supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>rom</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pflash</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='readonly'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>yes</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>no</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='secure'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>no</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </loader>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </os>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <cpu>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='host-passthrough' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='hostPassthroughMigratable'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>on</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>off</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='maximum' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='maximumMigratable'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>on</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>off</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='host-model' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <vendor>AMD</vendor>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='x2apic'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc-deadline'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='hypervisor'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc_adjust'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='spec-ctrl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='stibp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='arch-capabilities'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='cmp_legacy'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='overflow-recov'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='succor'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='amd-ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='virt-ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='lbrv'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc-scale'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='vmcb-clean'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='flushbyasid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pause-filter'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pfthreshold'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='svme-addr-chk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='rdctl-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='mds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pschange-mc-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='gds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='rfds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='disable' name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='custom' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Dhyana-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Genoa'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='auto-ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Genoa-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='auto-ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-128'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-256'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-512'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v6'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v7'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='KnightsMill'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512er'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512pf'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='KnightsMill-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512er'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512pf'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G4-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tbm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G5-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tbm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SierraForest'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cmpccxadd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SierraForest-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cmpccxadd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='athlon'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='athlon-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='core2duo'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='core2duo-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='coreduo'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='coreduo-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='n270'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='n270-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='phenom'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='phenom-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <memoryBacking supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <enum name='sourceType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>file</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>anonymous</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>memfd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </memoryBacking>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <disk supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='diskDevice'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>disk</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>cdrom</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>floppy</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>lun</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='bus'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>ide</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>fdc</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>scsi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>sata</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-non-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <graphics supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vnc</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>egl-headless</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>dbus</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </graphics>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <video supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='modelType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vga</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>cirrus</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>none</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>bochs</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>ramfb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </video>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <hostdev supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='mode'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>subsystem</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='startupPolicy'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>default</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>mandatory</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>requisite</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>optional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='subsysType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pci</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>scsi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='capsType'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='pciBackend'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </hostdev>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <rng supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-non-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>random</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>egd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>builtin</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <filesystem supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='driverType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>path</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>handle</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtiofs</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </filesystem>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <tpm supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tpm-tis</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tpm-crb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>emulator</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>external</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendVersion'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>2.0</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </tpm>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <redirdev supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='bus'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </redirdev>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <channel supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pty</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>unix</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </channel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <crypto supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>qemu</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>builtin</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </crypto>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <interface supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>default</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>passt</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <panic supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>isa</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>hyperv</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </panic>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <features>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <gic supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <vmcoreinfo supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <genid supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <backingStoreInput supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <backup supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <async-teardown supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <ps2 supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <sev supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <sgx supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <hyperv supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='features'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>relaxed</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vapic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>spinlocks</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vpindex</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>runtime</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>synic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>stimer</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>reset</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vendor_id</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>frequencies</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>reenlightenment</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tlbflush</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>ipi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>avic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>emsr_bitmap</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>xmm_input</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </hyperv>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <launchSecurity supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </features>
Oct 02 08:01:15 compute-0 nova_compute[192567]: </domainCapabilities>
Oct 02 08:01:15 compute-0 nova_compute[192567]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.568 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 02 08:01:15 compute-0 nova_compute[192567]: <domainCapabilities>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <path>/usr/libexec/qemu-kvm</path>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <domain>kvm</domain>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <arch>x86_64</arch>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <vcpu max='4096'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <iothreads supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <os supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <enum name='firmware'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>efi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <loader supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>rom</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pflash</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='readonly'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>yes</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>no</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='secure'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>yes</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>no</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </loader>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </os>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <cpu>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='host-passthrough' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='hostPassthroughMigratable'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>on</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>off</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='maximum' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='maximumMigratable'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>on</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>off</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='host-model' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <vendor>AMD</vendor>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='x2apic'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc-deadline'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='hypervisor'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc_adjust'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='spec-ctrl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='stibp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='arch-capabilities'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='cmp_legacy'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='overflow-recov'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='succor'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='amd-ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='virt-ssbd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='lbrv'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='tsc-scale'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='vmcb-clean'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='flushbyasid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pause-filter'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pfthreshold'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='svme-addr-chk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='rdctl-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='mds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='pschange-mc-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='gds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='require' name='rfds-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <feature policy='disable' name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <mode name='custom' supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Broadwell-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cascadelake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Cooperlake-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Denverton-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Dhyana-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Genoa'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='auto-ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Genoa-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='auto-ibrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Milan-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amd-psfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='no-nested-data-bp'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='null-sel-clr-base'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='stibp-always-on'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-Rome-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='EPYC-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='GraniteRapids-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-128'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-256'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx10-512'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='prefetchiti'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Haswell-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-noTSX'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v6'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Icelake-Server-v7'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='IvyBridge-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='KnightsMill'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512er'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512pf'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='KnightsMill-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4fmaps'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-4vnniw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512er'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512pf'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G4-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tbm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Opteron_G5-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fma4'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tbm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xop'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SapphireRapids-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='amx-tile'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-bf16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-fp16'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512-vpopcntdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bitalg'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vbmi2'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrc'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fzrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='la57'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='taa-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='tsx-ldtrk'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xfd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SierraForest'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cmpccxadd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='SierraForest-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ifma'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-ne-convert'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx-vnni-int8'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='bus-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cmpccxadd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fbsdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='fsrs'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ibrs-all'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mcdt-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pbrsb-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='psdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='sbdr-ssdp-no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='serialize'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vaes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='vpclmulqdq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Client-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='hle'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='rtm'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Skylake-Server-v5'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512bw'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512cd'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512dq'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512f'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='avx512vl'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='invpcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pcid'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='pku'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='mpx'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v2'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v3'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='core-capability'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='split-lock-detect'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='Snowridge-v4'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='cldemote'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='erms'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='gfni'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdir64b'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='movdiri'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='xsaves'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='athlon'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='athlon-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='core2duo'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='core2duo-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='coreduo'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='coreduo-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='n270'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='n270-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='ss'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='phenom'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <blockers model='phenom-v1'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnow'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <feature name='3dnowext'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </blockers>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </mode>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <memoryBacking supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <enum name='sourceType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>file</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>anonymous</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <value>memfd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </memoryBacking>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <disk supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='diskDevice'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>disk</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>cdrom</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>floppy</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>lun</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='bus'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>fdc</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>scsi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>sata</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-non-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <graphics supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vnc</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>egl-headless</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>dbus</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </graphics>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <video supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='modelType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vga</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>cirrus</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>none</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>bochs</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>ramfb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </video>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <hostdev supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='mode'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>subsystem</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='startupPolicy'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>default</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>mandatory</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>requisite</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>optional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='subsysType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pci</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>scsi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='capsType'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='pciBackend'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </hostdev>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <rng supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtio-non-transitional</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>random</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>egd</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>builtin</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <filesystem supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='driverType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>path</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>handle</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>virtiofs</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </filesystem>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <tpm supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tpm-tis</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tpm-crb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>emulator</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>external</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendVersion'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>2.0</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </tpm>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <redirdev supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='bus'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>usb</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </redirdev>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <channel supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>pty</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>unix</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </channel>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <crypto supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='type'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>qemu</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendModel'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>builtin</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </crypto>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <interface supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='backendType'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>default</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>passt</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <panic supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='model'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>isa</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>hyperv</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </panic>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   <features>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <gic supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <vmcoreinfo supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <genid supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <backingStoreInput supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <backup supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <async-teardown supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <ps2 supported='yes'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <sev supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <sgx supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <hyperv supported='yes'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       <enum name='features'>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>relaxed</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vapic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>spinlocks</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vpindex</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>runtime</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>synic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>stimer</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>reset</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>vendor_id</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>frequencies</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>reenlightenment</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>tlbflush</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>ipi</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>avic</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>emsr_bitmap</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:         <value>xmm_input</value>
Oct 02 08:01:15 compute-0 nova_compute[192567]:       </enum>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     </hyperv>
Oct 02 08:01:15 compute-0 nova_compute[192567]:     <launchSecurity supported='no'/>
Oct 02 08:01:15 compute-0 nova_compute[192567]:   </features>
Oct 02 08:01:15 compute-0 nova_compute[192567]: </domainCapabilities>
Oct 02 08:01:15 compute-0 nova_compute[192567]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.635 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.636 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.636 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.636 2 INFO nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Secure Boot support detected
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.638 2 INFO nova.virt.libvirt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.639 2 INFO nova.virt.libvirt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.647 2 DEBUG nova.virt.libvirt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.663 2 INFO nova.virt.node [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Determined node identity e7f6698e-de2d-4705-8493-a3445ce0cf6e from /var/lib/nova/compute_id
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.692 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Verified node e7f6698e-de2d-4705-8493-a3445ce0cf6e matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.717 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.766 2 ERROR nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Could not retrieve compute node resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'e7f6698e-de2d-4705-8493-a3445ce0cf6e' not found: No resource provider with uuid e7f6698e-de2d-4705-8493-a3445ce0cf6e found  ", "request_id": "req-96f6ff36-f9c1-4a58-aee0-40987a777ce1"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'e7f6698e-de2d-4705-8493-a3445ce0cf6e' not found: No resource provider with uuid e7f6698e-de2d-4705-8493-a3445ce0cf6e found  ", "request_id": "req-96f6ff36-f9c1-4a58-aee0-40987a777ce1"}]}
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.785 2 DEBUG oslo_concurrency.lockutils [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.785 2 DEBUG oslo_concurrency.lockutils [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.786 2 DEBUG oslo_concurrency.lockutils [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.786 2 DEBUG nova.compute.resource_tracker [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.963 2 WARNING nova.virt.libvirt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.964 2 DEBUG nova.compute.resource_tracker [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6188MB free_disk=73.66900253295898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.965 2 DEBUG oslo_concurrency.lockutils [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:01:15 compute-0 nova_compute[192567]: 2025-10-02 08:01:15.965 2 DEBUG oslo_concurrency.lockutils [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.511 2 ERROR nova.compute.resource_tracker [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'e7f6698e-de2d-4705-8493-a3445ce0cf6e' not found: No resource provider with uuid e7f6698e-de2d-4705-8493-a3445ce0cf6e found  ", "request_id": "req-5275d078-074b-4333-b277-10b92940823b"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'e7f6698e-de2d-4705-8493-a3445ce0cf6e' not found: No resource provider with uuid e7f6698e-de2d-4705-8493-a3445ce0cf6e found  ", "request_id": "req-5275d078-074b-4333-b277-10b92940823b"}]}
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.512 2 DEBUG nova.compute.resource_tracker [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.512 2 DEBUG nova.compute.resource_tracker [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.648 2 INFO nova.scheduler.client.report [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [req-af196ba8-4340-44d9-aacf-b631986de763] Created resource provider record via placement API for resource provider with UUID e7f6698e-de2d-4705-8493-a3445ce0cf6e and name compute-0.ctlplane.example.com.
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.689 2 DEBUG nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct 02 08:01:16 compute-0 nova_compute[192567]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.689 2 INFO nova.virt.libvirt.host [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] kernel doesn't support AMD SEV
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.690 2 DEBUG nova.compute.provider_tree [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.691 2 DEBUG nova.virt.libvirt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.737 2 DEBUG nova.scheduler.client.report [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Updated inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.738 2 DEBUG nova.compute.provider_tree [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.738 2 DEBUG nova.compute.provider_tree [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.834 2 DEBUG nova.compute.provider_tree [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.855 2 DEBUG nova.compute.resource_tracker [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.855 2 DEBUG oslo_concurrency.lockutils [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.890s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.855 2 DEBUG nova.service [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.919 2 DEBUG nova.service [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Oct 02 08:01:16 compute-0 nova_compute[192567]: 2025-10-02 08:01:16.919 2 DEBUG nova.servicegroup.drivers.db [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Oct 02 08:01:19 compute-0 sshd-session[192863]: Accepted publickey for zuul from 192.168.122.30 port 45122 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 08:01:19 compute-0 systemd-logind[827]: New session 28 of user zuul.
Oct 02 08:01:19 compute-0 systemd[1]: Started Session 28 of User zuul.
Oct 02 08:01:19 compute-0 sshd-session[192863]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 08:01:20 compute-0 python3.9[193016]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 02 08:01:22 compute-0 podman[193097]: 2025-10-02 08:01:22.207261729 +0000 UTC m=+0.103701653 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 02 08:01:22 compute-0 podman[193099]: 2025-10-02 08:01:22.207376112 +0000 UTC m=+0.102901479 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Oct 02 08:01:22 compute-0 podman[193098]: 2025-10-02 08:01:22.243384483 +0000 UTC m=+0.139716734 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:01:22 compute-0 sudo[193234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpvlptvfhhnwgmbvvkomtuqiurraghoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392081.6872768-52-21812923699846/AnsiballZ_systemd_service.py'
Oct 02 08:01:22 compute-0 sudo[193234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:22 compute-0 python3.9[193236]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 08:01:22 compute-0 systemd[1]: Reloading.
Oct 02 08:01:22 compute-0 systemd-rc-local-generator[193263]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 08:01:22 compute-0 systemd-sysv-generator[193266]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 08:01:23 compute-0 sudo[193234]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:24 compute-0 python3.9[193421]: ansible-ansible.builtin.service_facts Invoked
Oct 02 08:01:24 compute-0 network[193438]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 02 08:01:24 compute-0 network[193439]: 'network-scripts' will be removed from distribution in near future.
Oct 02 08:01:24 compute-0 network[193440]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 02 08:01:24 compute-0 podman[193445]: 2025-10-02 08:01:24.368295283 +0000 UTC m=+0.102926819 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:01:30 compute-0 sudo[193735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dojsovffuetwqtyossrsaurxxlyqcrre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392089.8887744-90-45361497266596/AnsiballZ_systemd_service.py'
Oct 02 08:01:30 compute-0 sudo[193735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:30 compute-0 python3.9[193737]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 08:01:30 compute-0 sudo[193735]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:31 compute-0 sudo[193888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qauchkhgzpxljjrxwlhubddhhzgoltma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392091.076141-110-141274502047205/AnsiballZ_file.py'
Oct 02 08:01:31 compute-0 sudo[193888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:31 compute-0 python3.9[193890]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:31 compute-0 sudo[193888]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:31 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 08:01:32 compute-0 sudo[194041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkudprtrgvbkpjcpulziwvterlauhvjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392092.185343-126-136162438583282/AnsiballZ_file.py'
Oct 02 08:01:32 compute-0 sudo[194041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:32 compute-0 python3.9[194043]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:32 compute-0 sudo[194041]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:33 compute-0 sudo[194193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drlutuxjavcjyhokpsincvbpumuujmot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392093.2084029-144-76982205159012/AnsiballZ_command.py'
Oct 02 08:01:33 compute-0 sudo[194193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:34 compute-0 python3.9[194195]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:01:34 compute-0 sudo[194193]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:35 compute-0 python3.9[194347]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 02 08:01:35 compute-0 sudo[194497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjcbhozlejxmlmftpchmvpseldcsyvwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392095.4277732-180-280130801079801/AnsiballZ_systemd_service.py'
Oct 02 08:01:35 compute-0 sudo[194497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:36 compute-0 python3.9[194499]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 08:01:36 compute-0 systemd[1]: Reloading.
Oct 02 08:01:36 compute-0 systemd-sysv-generator[194530]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 08:01:36 compute-0 systemd-rc-local-generator[194526]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 08:01:36 compute-0 sudo[194497]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:37 compute-0 sudo[194684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsgmbyemclydktaufxmacvryxjwpvuwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392096.8341203-196-239849897341894/AnsiballZ_command.py'
Oct 02 08:01:37 compute-0 sudo[194684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:37 compute-0 python3.9[194686]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:01:37 compute-0 sudo[194684]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:38 compute-0 sudo[194837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klwhfbxldwbizzqvjihmwyhhgxgbacvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392097.8724356-214-69449069236680/AnsiballZ_file.py'
Oct 02 08:01:38 compute-0 sudo[194837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:38 compute-0 python3.9[194839]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:01:38 compute-0 sudo[194837]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:39 compute-0 python3.9[194989]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:01:40 compute-0 python3.9[195141]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:01:41 compute-0 python3.9[195262]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759392099.7619226-246-163648392963316/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:01:42 compute-0 sudo[195412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jenjduehbfgdbtnijqgqhfabvrbgyndr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392101.4792094-276-9447385098869/AnsiballZ_group.py'
Oct 02 08:01:42 compute-0 sudo[195412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:42 compute-0 python3.9[195414]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct 02 08:01:42 compute-0 sudo[195412]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:43 compute-0 sudo[195564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iajxxfvaovpmxkjmgnnhxzakymvtmloi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392102.6770883-298-124274526696734/AnsiballZ_getent.py'
Oct 02 08:01:43 compute-0 sudo[195564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:43 compute-0 python3.9[195566]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct 02 08:01:43 compute-0 sudo[195564]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:44 compute-0 sudo[195717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pygsjpyzbrevmgfreifafkaemuixwlco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392103.8006408-314-163642001808419/AnsiballZ_group.py'
Oct 02 08:01:44 compute-0 sudo[195717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:44 compute-0 python3.9[195719]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 02 08:01:44 compute-0 groupadd[195720]: group added to /etc/group: name=ceilometer, GID=42405
Oct 02 08:01:44 compute-0 groupadd[195720]: group added to /etc/gshadow: name=ceilometer
Oct 02 08:01:44 compute-0 groupadd[195720]: new group: name=ceilometer, GID=42405
Oct 02 08:01:44 compute-0 sudo[195717]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:45 compute-0 auditd[709]: Audit daemon rotating log files
Oct 02 08:01:45 compute-0 sudo[195875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jofsjgomkqocfxmgiclnoinorvlyvier ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392104.7837245-330-169876466963213/AnsiballZ_user.py'
Oct 02 08:01:45 compute-0 sudo[195875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:01:45 compute-0 python3.9[195877]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 02 08:01:45 compute-0 useradd[195879]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Oct 02 08:01:45 compute-0 useradd[195879]: add 'ceilometer' to group 'libvirt'
Oct 02 08:01:45 compute-0 useradd[195879]: add 'ceilometer' to shadow group 'libvirt'
Oct 02 08:01:45 compute-0 sudo[195875]: pam_unix(sudo:session): session closed for user root
Oct 02 08:01:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:01:45.960 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:01:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:01:45.961 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:01:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:01:45.961 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:01:47 compute-0 python3.9[196035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:01:47 compute-0 python3.9[196156]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759392106.6009128-382-10291685410123/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:48 compute-0 python3.9[196306]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:01:49 compute-0 python3.9[196427]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759392107.7988393-382-22428807248115/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:49 compute-0 python3.9[196577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:01:50 compute-0 python3.9[196698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759392109.2528203-382-31449190256306/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:51 compute-0 python3.9[196848]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:01:52 compute-0 python3.9[197000]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:01:53 compute-0 podman[197128]: 2025-10-02 08:01:53.104998817 +0000 UTC m=+0.105499027 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 02 08:01:53 compute-0 podman[197126]: 2025-10-02 08:01:53.162634116 +0000 UTC m=+0.164370324 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Oct 02 08:01:53 compute-0 podman[197127]: 2025-10-02 08:01:53.186991747 +0000 UTC m=+0.188203739 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:01:53 compute-0 python3.9[197186]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:01:53 compute-0 python3.9[197333]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392112.6405706-500-43423587955209/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:54 compute-0 python3.9[197483]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:01:55 compute-0 python3.9[197559]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:55 compute-0 podman[197560]: 2025-10-02 08:01:55.220269817 +0000 UTC m=+0.072905168 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:01:55 compute-0 python3.9[197729]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:01:56 compute-0 python3.9[197850]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392115.2982557-500-255166728593842/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:57 compute-0 python3.9[198000]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:01:57 compute-0 python3.9[198121]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392116.6782777-500-121300148422631/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:58 compute-0 python3.9[198271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:01:59 compute-0 python3.9[198392]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392118.0781324-500-164108994383982/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:01:59 compute-0 nova_compute[192567]: 2025-10-02 08:01:59.921 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:01:59 compute-0 nova_compute[192567]: 2025-10-02 08:01:59.942 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:01:59 compute-0 python3.9[198542]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:00 compute-0 python3.9[198663]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392119.419994-500-276952931175303/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:01 compute-0 python3.9[198813]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:02 compute-0 python3.9[198934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392120.8359547-500-210734074310096/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:02 compute-0 python3.9[199084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:03 compute-0 python3.9[199205]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392122.3133621-500-160494166789557/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:04 compute-0 python3.9[199355]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:05 compute-0 python3.9[199476]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392123.7257633-500-240086135079686/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:06 compute-0 python3.9[199626]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:06 compute-0 python3.9[199747]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392125.3221078-500-175960066592225/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:07 compute-0 python3.9[199897]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:08 compute-0 python3.9[200018]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392126.8927126-500-1987928187772/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:08 compute-0 python3.9[200168]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:09 compute-0 python3.9[200244]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:10 compute-0 python3.9[200394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:11 compute-0 python3.9[200470]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:11 compute-0 python3.9[200620]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:12 compute-0 python3.9[200696]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:13 compute-0 sudo[200846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxklzxbfyhazpvvofapekjvdzcphnvjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392132.6679475-878-137909722167444/AnsiballZ_file.py'
Oct 02 08:02:13 compute-0 sudo[200846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:13 compute-0 python3.9[200848]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:13 compute-0 sudo[200846]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:13 compute-0 sudo[200998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbdeusvzahskhbvmuiqothesaywytyil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392133.5283833-894-9230258126500/AnsiballZ_file.py'
Oct 02 08:02:13 compute-0 sudo[200998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:14 compute-0 python3.9[201000]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:14 compute-0 sudo[200998]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.627 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.627 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.627 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.638 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.638 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.639 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.639 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.639 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.639 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.640 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.640 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.640 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:02:14 compute-0 sudo[201150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akxyjbzjbkndgdgaahsdvetvykovwvuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392134.3460405-910-75551451575228/AnsiballZ_file.py'
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.659 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.659 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.660 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.660 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:02:14 compute-0 sudo[201150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.877 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.879 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6154MB free_disk=73.66923904418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.879 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.879 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:02:14 compute-0 python3.9[201152]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:02:14 compute-0 sudo[201150]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.961 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.961 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:02:14 compute-0 nova_compute[192567]: 2025-10-02 08:02:14.992 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:02:15 compute-0 nova_compute[192567]: 2025-10-02 08:02:15.007 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:02:15 compute-0 nova_compute[192567]: 2025-10-02 08:02:15.009 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:02:15 compute-0 nova_compute[192567]: 2025-10-02 08:02:15.010 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:02:15 compute-0 sudo[201302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhbjpxoigspynjbilbdivfbzwovtepnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392135.1164227-926-91735852461581/AnsiballZ_systemd_service.py'
Oct 02 08:02:15 compute-0 sudo[201302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:15 compute-0 python3.9[201304]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 08:02:16 compute-0 systemd[1]: Reloading.
Oct 02 08:02:16 compute-0 systemd-sysv-generator[201335]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 08:02:16 compute-0 systemd-rc-local-generator[201330]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 08:02:16 compute-0 systemd[1]: Listening on Podman API Socket.
Oct 02 08:02:16 compute-0 sudo[201302]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:17 compute-0 sudo[201493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxkvuanvpugrohdfyslhnznoxgevpgbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392136.8475502-944-250715098403779/AnsiballZ_stat.py'
Oct 02 08:02:17 compute-0 sudo[201493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:17 compute-0 python3.9[201495]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:17 compute-0 sudo[201493]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:17 compute-0 sudo[201616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vannfuimzirbdsqtuotbhpkytnwtpimx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392136.8475502-944-250715098403779/AnsiballZ_copy.py'
Oct 02 08:02:17 compute-0 sudo[201616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:17 compute-0 python3.9[201618]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759392136.8475502-944-250715098403779/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:02:18 compute-0 sudo[201616]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:18 compute-0 sudo[201768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcqjqsmvtkbhagsigtjzirrpthfqgyei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392138.4094896-978-15451325941421/AnsiballZ_container_config_data.py'
Oct 02 08:02:18 compute-0 sudo[201768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:19 compute-0 python3.9[201770]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct 02 08:02:19 compute-0 sudo[201768]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:19 compute-0 sudo[201920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wujehrabzqskqlhdpifymwdxtjqstarc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392139.4705684-996-170836350213583/AnsiballZ_container_config_hash.py'
Oct 02 08:02:19 compute-0 sudo[201920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:20 compute-0 python3.9[201922]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 02 08:02:20 compute-0 sudo[201920]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:21 compute-0 sudo[202072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itjrvkttperksrnzfrzoymjrjqailpvb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759392140.5937974-1016-158799558828577/AnsiballZ_edpm_container_manage.py'
Oct 02 08:02:21 compute-0 sudo[202072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:21 compute-0 python3[202074]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 02 08:02:23 compute-0 podman[202087]: 2025-10-02 08:02:23.002728974 +0000 UTC m=+1.447871784 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct 02 08:02:23 compute-0 podman[202186]: 2025-10-02 08:02:23.222308746 +0000 UTC m=+0.072601721 container create 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Oct 02 08:02:23 compute-0 podman[202186]: 2025-10-02 08:02:23.19096031 +0000 UTC m=+0.041253335 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct 02 08:02:23 compute-0 python3[202074]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct 02 08:02:23 compute-0 sudo[202072]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:24 compute-0 podman[202348]: 2025-10-02 08:02:24.059403543 +0000 UTC m=+0.088541276 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 02 08:02:24 compute-0 podman[202350]: 2025-10-02 08:02:24.073293226 +0000 UTC m=+0.088158835 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, tcib_managed=true)
Oct 02 08:02:24 compute-0 sudo[202419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvmpucijoaedbsfveyyccfdptlyvmzmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392143.647189-1032-152330753300671/AnsiballZ_stat.py'
Oct 02 08:02:24 compute-0 sudo[202419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:24 compute-0 podman[202349]: 2025-10-02 08:02:24.127511483 +0000 UTC m=+0.146008385 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 02 08:02:24 compute-0 python3.9[202430]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:02:24 compute-0 sudo[202419]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:25 compute-0 sudo[202585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvyfboohycscrfquzkdkjuuirjokkfmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392144.6720407-1050-2648595491822/AnsiballZ_file.py'
Oct 02 08:02:25 compute-0 sudo[202585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:25 compute-0 python3.9[202587]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:25 compute-0 sudo[202585]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:25 compute-0 sudo[202749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dktroztfbnkvohcgllecwuwqjdlssmal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392145.3718448-1050-204112061171519/AnsiballZ_copy.py'
Oct 02 08:02:25 compute-0 sudo[202749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:25 compute-0 podman[202710]: 2025-10-02 08:02:25.985044083 +0000 UTC m=+0.122058579 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 02 08:02:26 compute-0 python3.9[202756]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759392145.3718448-1050-204112061171519/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:26 compute-0 sudo[202749]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:26 compute-0 sudo[202832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctyiwshiaxyyrivycuhfbagsogngfgye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392145.3718448-1050-204112061171519/AnsiballZ_systemd.py'
Oct 02 08:02:26 compute-0 sudo[202832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:27 compute-0 python3.9[202834]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 08:02:27 compute-0 systemd[1]: Reloading.
Oct 02 08:02:27 compute-0 systemd-rc-local-generator[202861]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 08:02:27 compute-0 systemd-sysv-generator[202864]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 08:02:27 compute-0 sudo[202832]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:27 compute-0 sudo[202943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acmpztushejdniknluovyhgyddyuipch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392145.3718448-1050-204112061171519/AnsiballZ_systemd.py'
Oct 02 08:02:27 compute-0 sudo[202943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:28 compute-0 python3.9[202945]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 08:02:28 compute-0 systemd[1]: Reloading.
Oct 02 08:02:28 compute-0 systemd-rc-local-generator[202973]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 08:02:28 compute-0 systemd-sysv-generator[202977]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 08:02:28 compute-0 systemd[1]: Starting podman_exporter container...
Oct 02 08:02:28 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:02:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0858a3a7389f437996e8399fc0b5c30f015f23367ab63915b4b07d60e2a2afca/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 02 08:02:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0858a3a7389f437996e8399fc0b5c30f015f23367ab63915b4b07d60e2a2afca/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 02 08:02:28 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3.
Oct 02 08:02:28 compute-0 podman[202985]: 2025-10-02 08:02:28.877279849 +0000 UTC m=+0.192512611 container init 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:02:28 compute-0 podman_exporter[203000]: ts=2025-10-02T08:02:28.902Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 02 08:02:28 compute-0 podman_exporter[203000]: ts=2025-10-02T08:02:28.902Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 02 08:02:28 compute-0 podman_exporter[203000]: ts=2025-10-02T08:02:28.903Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 02 08:02:28 compute-0 podman_exporter[203000]: ts=2025-10-02T08:02:28.903Z caller=handler.go:105 level=info collector=container
Oct 02 08:02:28 compute-0 podman[202985]: 2025-10-02 08:02:28.907373126 +0000 UTC m=+0.222605858 container start 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:02:28 compute-0 podman[202985]: podman_exporter
Oct 02 08:02:28 compute-0 systemd[1]: Starting Podman API Service...
Oct 02 08:02:28 compute-0 systemd[1]: Started Podman API Service.
Oct 02 08:02:28 compute-0 systemd[1]: Started podman_exporter container.
Oct 02 08:02:28 compute-0 podman[203011]: time="2025-10-02T08:02:28Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct 02 08:02:28 compute-0 podman[203011]: time="2025-10-02T08:02:28Z" level=info msg="Setting parallel job count to 25"
Oct 02 08:02:28 compute-0 podman[203011]: time="2025-10-02T08:02:28Z" level=info msg="Using sqlite as database backend"
Oct 02 08:02:28 compute-0 podman[203011]: time="2025-10-02T08:02:28Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct 02 08:02:28 compute-0 podman[203011]: time="2025-10-02T08:02:28Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct 02 08:02:28 compute-0 podman[203011]: time="2025-10-02T08:02:28Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct 02 08:02:28 compute-0 sudo[202943]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:28 compute-0 podman[203011]: @ - - [02/Oct/2025:08:02:28 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 02 08:02:28 compute-0 podman[203011]: time="2025-10-02T08:02:28Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:02:28 compute-0 podman[203009]: 2025-10-02 08:02:28.997719217 +0000 UTC m=+0.077330317 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:02:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:02:28 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16556 "" "Go-http-client/1.1"
Oct 02 08:02:29 compute-0 podman_exporter[203000]: ts=2025-10-02T08:02:29.007Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 02 08:02:29 compute-0 podman_exporter[203000]: ts=2025-10-02T08:02:29.008Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 02 08:02:29 compute-0 podman_exporter[203000]: ts=2025-10-02T08:02:29.008Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct 02 08:02:29 compute-0 systemd[1]: 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3-345af8b0e7793e4a.service: Main process exited, code=exited, status=1/FAILURE
Oct 02 08:02:29 compute-0 systemd[1]: 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3-345af8b0e7793e4a.service: Failed with result 'exit-code'.
Oct 02 08:02:29 compute-0 sudo[203197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-insfyroqnllkqhcbmnyzmqudsxelmywo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392149.1971052-1098-166255888521702/AnsiballZ_systemd.py'
Oct 02 08:02:29 compute-0 sudo[203197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:29 compute-0 python3.9[203199]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 08:02:29 compute-0 systemd[1]: Stopping podman_exporter container...
Oct 02 08:02:30 compute-0 podman[203011]: @ - - [02/Oct/2025:08:02:28 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Oct 02 08:02:30 compute-0 systemd[1]: libpod-922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3.scope: Deactivated successfully.
Oct 02 08:02:30 compute-0 podman[203203]: 2025-10-02 08:02:30.0216852 +0000 UTC m=+0.067130751 container died 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:02:30 compute-0 systemd[1]: 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3-345af8b0e7793e4a.timer: Deactivated successfully.
Oct 02 08:02:30 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3.
Oct 02 08:02:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3-userdata-shm.mount: Deactivated successfully.
Oct 02 08:02:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-0858a3a7389f437996e8399fc0b5c30f015f23367ab63915b4b07d60e2a2afca-merged.mount: Deactivated successfully.
Oct 02 08:02:30 compute-0 podman[203203]: 2025-10-02 08:02:30.201387671 +0000 UTC m=+0.246833202 container cleanup 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:02:30 compute-0 podman[203203]: podman_exporter
Oct 02 08:02:30 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 02 08:02:30 compute-0 podman[203229]: podman_exporter
Oct 02 08:02:30 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct 02 08:02:30 compute-0 systemd[1]: Stopped podman_exporter container.
Oct 02 08:02:30 compute-0 systemd[1]: Starting podman_exporter container...
Oct 02 08:02:30 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:02:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0858a3a7389f437996e8399fc0b5c30f015f23367ab63915b4b07d60e2a2afca/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 02 08:02:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0858a3a7389f437996e8399fc0b5c30f015f23367ab63915b4b07d60e2a2afca/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 02 08:02:30 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3.
Oct 02 08:02:30 compute-0 podman[203242]: 2025-10-02 08:02:30.489514247 +0000 UTC m=+0.161772585 container init 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:02:30 compute-0 podman_exporter[203257]: ts=2025-10-02T08:02:30.515Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 02 08:02:30 compute-0 podman_exporter[203257]: ts=2025-10-02T08:02:30.516Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 02 08:02:30 compute-0 podman_exporter[203257]: ts=2025-10-02T08:02:30.516Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 02 08:02:30 compute-0 podman_exporter[203257]: ts=2025-10-02T08:02:30.516Z caller=handler.go:105 level=info collector=container
Oct 02 08:02:30 compute-0 podman[203011]: @ - - [02/Oct/2025:08:02:30 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 02 08:02:30 compute-0 podman[203011]: time="2025-10-02T08:02:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:02:30 compute-0 podman[203242]: 2025-10-02 08:02:30.533609288 +0000 UTC m=+0.205867566 container start 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:02:30 compute-0 podman[203242]: podman_exporter
Oct 02 08:02:30 compute-0 podman[203011]: @ - - [02/Oct/2025:08:02:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16558 "" "Go-http-client/1.1"
Oct 02 08:02:30 compute-0 podman_exporter[203257]: ts=2025-10-02T08:02:30.543Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 02 08:02:30 compute-0 podman_exporter[203257]: ts=2025-10-02T08:02:30.544Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 02 08:02:30 compute-0 podman_exporter[203257]: ts=2025-10-02T08:02:30.545Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct 02 08:02:30 compute-0 systemd[1]: Started podman_exporter container.
Oct 02 08:02:30 compute-0 sudo[203197]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:30 compute-0 podman[203267]: 2025-10-02 08:02:30.634137447 +0000 UTC m=+0.084204581 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:02:31 compute-0 sudo[203443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehxbjhsmthagkwqqilahdxmtmhlbrpdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392150.879262-1114-181094963006713/AnsiballZ_stat.py'
Oct 02 08:02:31 compute-0 sudo[203443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:31 compute-0 python3.9[203445]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:02:31 compute-0 sudo[203443]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:31 compute-0 sudo[203566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dscklymijsgaardzdexdjuelgzkujxfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392150.879262-1114-181094963006713/AnsiballZ_copy.py'
Oct 02 08:02:31 compute-0 sudo[203566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:32 compute-0 python3.9[203568]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759392150.879262-1114-181094963006713/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 02 08:02:32 compute-0 sudo[203566]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:33 compute-0 sudo[203718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywyglqpeqyxhxargpwgiyymyjmkytplq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392152.65572-1148-1186298170373/AnsiballZ_container_config_data.py'
Oct 02 08:02:33 compute-0 sudo[203718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:33 compute-0 python3.9[203720]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct 02 08:02:33 compute-0 sudo[203718]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:33 compute-0 sudo[203870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foypdlkvtkpjgstccsxkzmwankwlrbys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392153.5548103-1166-257867071242277/AnsiballZ_container_config_hash.py'
Oct 02 08:02:33 compute-0 sudo[203870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:34 compute-0 python3.9[203872]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 02 08:02:34 compute-0 sudo[203870]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:34 compute-0 sudo[204022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pianrrltrvxpwgduznlumstybnupdrmp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759392154.5080676-1186-213216671154452/AnsiballZ_edpm_container_manage.py'
Oct 02 08:02:34 compute-0 sudo[204022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:35 compute-0 python3[204024]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 02 08:02:37 compute-0 podman[204037]: 2025-10-02 08:02:37.645672943 +0000 UTC m=+2.363831146 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 02 08:02:37 compute-0 podman[204134]: 2025-10-02 08:02:37.8425851 +0000 UTC m=+0.068033958 container create a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Oct 02 08:02:37 compute-0 podman[204134]: 2025-10-02 08:02:37.803761112 +0000 UTC m=+0.029209980 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 02 08:02:37 compute-0 python3[204024]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 02 08:02:38 compute-0 sudo[204022]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:38 compute-0 sudo[204323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfaelpihtsyheottentapbdccffjcciy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392158.3415449-1202-45837737164906/AnsiballZ_stat.py'
Oct 02 08:02:38 compute-0 sudo[204323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:38 compute-0 python3.9[204325]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:02:38 compute-0 sudo[204323]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:39 compute-0 sudo[204477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdycdhehjmntlvtqkmuchdfpsybikuqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392159.2978745-1220-29592181998394/AnsiballZ_file.py'
Oct 02 08:02:39 compute-0 sudo[204477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:39 compute-0 python3.9[204479]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:39 compute-0 sudo[204477]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:40 compute-0 sudo[204628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsxxlclmssjyougwnxrhrzovlcrmlnfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392160.0497234-1220-264968482724527/AnsiballZ_copy.py'
Oct 02 08:02:40 compute-0 sudo[204628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:40 compute-0 python3.9[204630]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759392160.0497234-1220-264968482724527/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:40 compute-0 sudo[204628]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:41 compute-0 sudo[204704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccevtdzctmaohiozsemjthmtnosejyrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392160.0497234-1220-264968482724527/AnsiballZ_systemd.py'
Oct 02 08:02:41 compute-0 sudo[204704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:41 compute-0 python3.9[204706]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 02 08:02:41 compute-0 systemd[1]: Reloading.
Oct 02 08:02:41 compute-0 systemd-rc-local-generator[204735]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 08:02:41 compute-0 systemd-sysv-generator[204739]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 08:02:41 compute-0 sudo[204704]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:42 compute-0 sudo[204815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfkflhoqhhnjfhcypzeiwxbapfkoksuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392160.0497234-1220-264968482724527/AnsiballZ_systemd.py'
Oct 02 08:02:42 compute-0 sudo[204815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:42 compute-0 python3.9[204817]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 02 08:02:42 compute-0 systemd[1]: Reloading.
Oct 02 08:02:42 compute-0 systemd-rc-local-generator[204847]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 02 08:02:42 compute-0 systemd-sysv-generator[204851]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 02 08:02:43 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct 02 08:02:43 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:02:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2716f6c71e3176cf323fcccb243f9f31a0f58b7b1e53d485f126dfcf5ae2d49b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 02 08:02:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2716f6c71e3176cf323fcccb243f9f31a0f58b7b1e53d485f126dfcf5ae2d49b/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 02 08:02:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2716f6c71e3176cf323fcccb243f9f31a0f58b7b1e53d485f126dfcf5ae2d49b/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 02 08:02:43 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9.
Oct 02 08:02:43 compute-0 podman[204857]: 2025-10-02 08:02:43.173624233 +0000 UTC m=+0.144191487 container init a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *bridge.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *coverage.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *datapath.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *iface.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *memory.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *ovnnorthd.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *ovn.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *ovsdbserver.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *pmd_perf.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *pmd_rxq.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: INFO    08:02:43 main.go:48: registering *vswitch.Collector
Oct 02 08:02:43 compute-0 openstack_network_exporter[204872]: NOTICE  08:02:43 main.go:76: listening on https://:9105/metrics
Oct 02 08:02:43 compute-0 podman[204857]: 2025-10-02 08:02:43.208268802 +0000 UTC m=+0.178836036 container start a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 02 08:02:43 compute-0 podman[204857]: openstack_network_exporter
Oct 02 08:02:43 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct 02 08:02:43 compute-0 sudo[204815]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:43 compute-0 podman[204882]: 2025-10-02 08:02:43.31941824 +0000 UTC m=+0.102075197 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, io.openshift.expose-services=)
Oct 02 08:02:44 compute-0 sudo[205054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sntqleoqwhvmxxpkyyoslhszjwqzdich ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392163.6676242-1268-44619689905942/AnsiballZ_systemd.py'
Oct 02 08:02:44 compute-0 sudo[205054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:44 compute-0 python3.9[205056]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 02 08:02:44 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Oct 02 08:02:44 compute-0 systemd[1]: libpod-a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9.scope: Deactivated successfully.
Oct 02 08:02:44 compute-0 podman[205060]: 2025-10-02 08:02:44.48854017 +0000 UTC m=+0.063901369 container died a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7)
Oct 02 08:02:44 compute-0 systemd[1]: a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9-689eaf281ce2bcb9.timer: Deactivated successfully.
Oct 02 08:02:44 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9.
Oct 02 08:02:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9-userdata-shm.mount: Deactivated successfully.
Oct 02 08:02:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-2716f6c71e3176cf323fcccb243f9f31a0f58b7b1e53d485f126dfcf5ae2d49b-merged.mount: Deactivated successfully.
Oct 02 08:02:45 compute-0 podman[205060]: 2025-10-02 08:02:45.19587254 +0000 UTC m=+0.771233759 container cleanup a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 02 08:02:45 compute-0 podman[205060]: openstack_network_exporter
Oct 02 08:02:45 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 02 08:02:45 compute-0 podman[205088]: openstack_network_exporter
Oct 02 08:02:45 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct 02 08:02:45 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Oct 02 08:02:45 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct 02 08:02:45 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:02:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2716f6c71e3176cf323fcccb243f9f31a0f58b7b1e53d485f126dfcf5ae2d49b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 02 08:02:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2716f6c71e3176cf323fcccb243f9f31a0f58b7b1e53d485f126dfcf5ae2d49b/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 02 08:02:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2716f6c71e3176cf323fcccb243f9f31a0f58b7b1e53d485f126dfcf5ae2d49b/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 02 08:02:45 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9.
Oct 02 08:02:45 compute-0 podman[205101]: 2025-10-02 08:02:45.481978172 +0000 UTC m=+0.165610143 container init a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64)
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *bridge.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *coverage.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *datapath.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *iface.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *memory.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *ovnnorthd.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *ovn.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *ovsdbserver.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *pmd_perf.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *pmd_rxq.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: INFO    08:02:45 main.go:48: registering *vswitch.Collector
Oct 02 08:02:45 compute-0 openstack_network_exporter[205118]: NOTICE  08:02:45 main.go:76: listening on https://:9105/metrics
Oct 02 08:02:45 compute-0 podman[205101]: 2025-10-02 08:02:45.526443486 +0000 UTC m=+0.210075477 container start a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, release=1755695350)
Oct 02 08:02:45 compute-0 podman[205101]: openstack_network_exporter
Oct 02 08:02:45 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct 02 08:02:45 compute-0 sudo[205054]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:45 compute-0 podman[205129]: 2025-10-02 08:02:45.64098485 +0000 UTC m=+0.094979976 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Oct 02 08:02:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:02:45.961 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:02:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:02:45.961 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:02:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:02:45.962 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:02:46 compute-0 sudo[205299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sukfmzdiqboqmcnxaodmcglifqzcvsdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392165.8313303-1284-234548034167193/AnsiballZ_find.py'
Oct 02 08:02:46 compute-0 sudo[205299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:46 compute-0 python3.9[205301]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 02 08:02:46 compute-0 sudo[205299]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:47 compute-0 sudo[205451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjnbfnmakuftjiccxzuucooyfldrpbef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392167.0055366-1303-185324483429240/AnsiballZ_podman_container_info.py'
Oct 02 08:02:47 compute-0 sudo[205451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:47 compute-0 python3.9[205453]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct 02 08:02:47 compute-0 sudo[205451]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:48 compute-0 sudo[205616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjieepyseqyfbisprqdrjrvbdarfviyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392168.140879-1311-171902465150668/AnsiballZ_podman_container_exec.py'
Oct 02 08:02:48 compute-0 sudo[205616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:49 compute-0 python3.9[205618]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:02:49 compute-0 systemd[1]: Started libpod-conmon-7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6.scope.
Oct 02 08:02:49 compute-0 podman[205619]: 2025-10-02 08:02:49.11532998 +0000 UTC m=+0.094988286 container exec 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 02 08:02:49 compute-0 podman[205619]: 2025-10-02 08:02:49.12754873 +0000 UTC m=+0.107207026 container exec_died 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 02 08:02:49 compute-0 sudo[205616]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:49 compute-0 systemd[1]: libpod-conmon-7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6.scope: Deactivated successfully.
Oct 02 08:02:49 compute-0 sudo[205801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otxuwsyskzildqqrmbnhzkbzuaokrkft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392169.433114-1319-112994070173233/AnsiballZ_podman_container_exec.py'
Oct 02 08:02:49 compute-0 sudo[205801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:50 compute-0 python3.9[205803]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:02:50 compute-0 systemd[1]: Started libpod-conmon-7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6.scope.
Oct 02 08:02:50 compute-0 podman[205804]: 2025-10-02 08:02:50.198282228 +0000 UTC m=+0.093895472 container exec 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 02 08:02:50 compute-0 podman[205804]: 2025-10-02 08:02:50.233750382 +0000 UTC m=+0.129363576 container exec_died 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:02:50 compute-0 systemd[1]: libpod-conmon-7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6.scope: Deactivated successfully.
Oct 02 08:02:50 compute-0 sudo[205801]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:50 compute-0 sudo[205985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btlxpmczdtvvtlmyxxibypfmpypsklef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392170.4952123-1327-156355784373397/AnsiballZ_file.py'
Oct 02 08:02:50 compute-0 sudo[205985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:51 compute-0 python3.9[205987]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:51 compute-0 sudo[205985]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:51 compute-0 sudo[206137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuoinomlspuwrondjfkehqwdlxfhwiok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392171.4216542-1336-258400815413077/AnsiballZ_podman_container_info.py'
Oct 02 08:02:51 compute-0 sudo[206137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:52 compute-0 python3.9[206139]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct 02 08:02:52 compute-0 sudo[206137]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:52 compute-0 sudo[206303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcixshkuumyxmfkfyqydhqwrokfcrftj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392172.4078507-1344-169017280334357/AnsiballZ_podman_container_exec.py'
Oct 02 08:02:52 compute-0 sudo[206303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:53 compute-0 python3.9[206305]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:02:53 compute-0 systemd[1]: Started libpod-conmon-66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa.scope.
Oct 02 08:02:53 compute-0 podman[206306]: 2025-10-02 08:02:53.137523998 +0000 UTC m=+0.103932896 container exec 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct 02 08:02:53 compute-0 podman[206306]: 2025-10-02 08:02:53.173737034 +0000 UTC m=+0.140145872 container exec_died 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:02:53 compute-0 systemd[1]: libpod-conmon-66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa.scope: Deactivated successfully.
Oct 02 08:02:53 compute-0 sudo[206303]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:53 compute-0 sudo[206487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tewizfjhwspigtpozxassvyzkvkjnyra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392173.4893315-1352-230593395491919/AnsiballZ_podman_container_exec.py'
Oct 02 08:02:53 compute-0 sudo[206487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:54 compute-0 python3.9[206489]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:02:54 compute-0 systemd[1]: Started libpod-conmon-66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa.scope.
Oct 02 08:02:54 compute-0 podman[206490]: 2025-10-02 08:02:54.273892628 +0000 UTC m=+0.107173827 container exec 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible)
Oct 02 08:02:54 compute-0 podman[206490]: 2025-10-02 08:02:54.305224902 +0000 UTC m=+0.138506091 container exec_died 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 02 08:02:54 compute-0 sudo[206487]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:54 compute-0 podman[206507]: 2025-10-02 08:02:54.355174427 +0000 UTC m=+0.079398641 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:02:54 compute-0 systemd[1]: libpod-conmon-66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa.scope: Deactivated successfully.
Oct 02 08:02:54 compute-0 podman[206509]: 2025-10-02 08:02:54.384427837 +0000 UTC m=+0.096352349 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct 02 08:02:54 compute-0 podman[206508]: 2025-10-02 08:02:54.396330888 +0000 UTC m=+0.121501753 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 02 08:02:54 compute-0 sudo[206731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shckkeysggukubrqnidnqmipobwxlmga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392174.5479145-1360-6305986031424/AnsiballZ_file.py'
Oct 02 08:02:54 compute-0 sudo[206731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:55 compute-0 python3.9[206733]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:55 compute-0 sudo[206731]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:55 compute-0 sudo[206883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chuzdwbakqqgpqljoslopabhznfvcylk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392175.3891075-1369-14358903537726/AnsiballZ_podman_container_info.py'
Oct 02 08:02:55 compute-0 sudo[206883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:55 compute-0 python3.9[206885]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct 02 08:02:56 compute-0 sudo[206883]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:56 compute-0 podman[206899]: 2025-10-02 08:02:56.180417333 +0000 UTC m=+0.086791192 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:02:56 compute-0 sudo[207068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eytxnvfosjrxgspvpsprenfavzmgcfmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392176.2653527-1377-101337717298143/AnsiballZ_podman_container_exec.py'
Oct 02 08:02:56 compute-0 sudo[207068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:56 compute-0 python3.9[207070]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:02:57 compute-0 systemd[1]: Started libpod-conmon-c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0.scope.
Oct 02 08:02:57 compute-0 podman[207071]: 2025-10-02 08:02:57.021872075 +0000 UTC m=+0.106940358 container exec c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid)
Oct 02 08:02:57 compute-0 podman[207071]: 2025-10-02 08:02:57.032657081 +0000 UTC m=+0.117725314 container exec_died c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct 02 08:02:57 compute-0 sudo[207068]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:57 compute-0 systemd[1]: libpod-conmon-c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0.scope: Deactivated successfully.
Oct 02 08:02:57 compute-0 sudo[207253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shjtuttbhwxlewzhpxssdhlnglknkbcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392177.341159-1385-276770873721818/AnsiballZ_podman_container_exec.py'
Oct 02 08:02:57 compute-0 sudo[207253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:57 compute-0 python3.9[207255]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:02:58 compute-0 systemd[1]: Started libpod-conmon-c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0.scope.
Oct 02 08:02:58 compute-0 podman[207256]: 2025-10-02 08:02:58.078012249 +0000 UTC m=+0.104769021 container exec c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:02:58 compute-0 podman[207256]: 2025-10-02 08:02:58.110883532 +0000 UTC m=+0.137640284 container exec_died c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Oct 02 08:02:58 compute-0 systemd[1]: libpod-conmon-c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0.scope: Deactivated successfully.
Oct 02 08:02:58 compute-0 sudo[207253]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:58 compute-0 sudo[207438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkiadpjdznsictyksreguyhvlrkcbyjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392178.407142-1393-150441850778765/AnsiballZ_file.py'
Oct 02 08:02:58 compute-0 sudo[207438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:59 compute-0 python3.9[207440]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:02:59 compute-0 sudo[207438]: pam_unix(sudo:session): session closed for user root
Oct 02 08:02:59 compute-0 sudo[207590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfjaddwygvbiycpyhotemntlknnfjghh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392179.3157363-1402-159713460674340/AnsiballZ_podman_container_info.py'
Oct 02 08:02:59 compute-0 sudo[207590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:02:59 compute-0 python3.9[207592]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct 02 08:03:00 compute-0 sudo[207590]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:00 compute-0 sudo[207771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nujgurqgblzctuuiaqojglwtawntaigk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392180.3492908-1410-125989200434639/AnsiballZ_podman_container_exec.py'
Oct 02 08:03:00 compute-0 sudo[207771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:00 compute-0 podman[207730]: 2025-10-02 08:03:00.755299897 +0000 UTC m=+0.068367348 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:03:00 compute-0 python3.9[207781]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:03:01 compute-0 systemd[1]: Started libpod-conmon-bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b.scope.
Oct 02 08:03:01 compute-0 podman[207782]: 2025-10-02 08:03:01.053718914 +0000 UTC m=+0.100954634 container exec bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:03:01 compute-0 podman[207782]: 2025-10-02 08:03:01.063550259 +0000 UTC m=+0.110785999 container exec_died bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible)
Oct 02 08:03:01 compute-0 sudo[207771]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:01 compute-0 systemd[1]: libpod-conmon-bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b.scope: Deactivated successfully.
Oct 02 08:03:01 compute-0 sudo[207962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeunndedulhpllltuchnxwzmwondlieg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392181.2949297-1418-136619986440474/AnsiballZ_podman_container_exec.py'
Oct 02 08:03:01 compute-0 sudo[207962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:01 compute-0 python3.9[207964]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:03:01 compute-0 systemd[1]: Started libpod-conmon-bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b.scope.
Oct 02 08:03:01 compute-0 podman[207965]: 2025-10-02 08:03:01.942944293 +0000 UTC m=+0.100654833 container exec bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 02 08:03:01 compute-0 podman[207965]: 2025-10-02 08:03:01.979358776 +0000 UTC m=+0.137069296 container exec_died bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:03:02 compute-0 systemd[1]: libpod-conmon-bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b.scope: Deactivated successfully.
Oct 02 08:03:02 compute-0 sudo[207962]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:02 compute-0 sudo[208144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsrnblwemigagraijghxjsrhbvzkijxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392182.2895467-1426-104955921795859/AnsiballZ_file.py'
Oct 02 08:03:02 compute-0 sudo[208144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:02 compute-0 python3.9[208146]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:02 compute-0 sudo[208144]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:03 compute-0 sudo[208296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcywaqdvdqypedhtnvwsmjwwdsaakvqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392183.2250986-1435-251399853635069/AnsiballZ_podman_container_info.py'
Oct 02 08:03:03 compute-0 sudo[208296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:03 compute-0 python3.9[208298]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct 02 08:03:03 compute-0 sudo[208296]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:04 compute-0 sudo[208461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfdlucixiuyvueituygsayksqenrpmaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392184.1529455-1443-137087349804855/AnsiballZ_podman_container_exec.py'
Oct 02 08:03:04 compute-0 sudo[208461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:04 compute-0 python3.9[208463]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:03:04 compute-0 systemd[1]: Started libpod-conmon-922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3.scope.
Oct 02 08:03:04 compute-0 podman[208464]: 2025-10-02 08:03:04.863560142 +0000 UTC m=+0.108726594 container exec 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:03:04 compute-0 podman[208464]: 2025-10-02 08:03:04.897736387 +0000 UTC m=+0.142902799 container exec_died 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:03:04 compute-0 systemd[1]: libpod-conmon-922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3.scope: Deactivated successfully.
Oct 02 08:03:04 compute-0 sudo[208461]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:05 compute-0 sudo[208645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbixzvlejiosdmrxkwajjedlbzemkmbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392185.187828-1451-247374668168823/AnsiballZ_podman_container_exec.py'
Oct 02 08:03:05 compute-0 sudo[208645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:05 compute-0 python3.9[208647]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:03:05 compute-0 systemd[1]: Started libpod-conmon-922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3.scope.
Oct 02 08:03:05 compute-0 podman[208648]: 2025-10-02 08:03:05.996472984 +0000 UTC m=+0.095073358 container exec 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:03:06 compute-0 podman[208648]: 2025-10-02 08:03:06.030619008 +0000 UTC m=+0.129219332 container exec_died 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:03:06 compute-0 systemd[1]: libpod-conmon-922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3.scope: Deactivated successfully.
Oct 02 08:03:06 compute-0 sudo[208645]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:06 compute-0 sudo[208830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lntvrabwiirshdkkdcrvhpqygtswtnka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392186.3359516-1459-1935656148142/AnsiballZ_file.py'
Oct 02 08:03:06 compute-0 sudo[208830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:06 compute-0 python3.9[208832]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:06 compute-0 sudo[208830]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:07 compute-0 sudo[208982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcufdbfutinwetjownpsrqkfzxotymvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392187.2343225-1468-167669423455598/AnsiballZ_podman_container_info.py'
Oct 02 08:03:07 compute-0 sudo[208982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:07 compute-0 python3.9[208984]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct 02 08:03:07 compute-0 sudo[208982]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:08 compute-0 sudo[209147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnxghgyrxnogrovkelmwawduyroxaots ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392188.2022672-1476-200436773549576/AnsiballZ_podman_container_exec.py'
Oct 02 08:03:08 compute-0 sudo[209147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:08 compute-0 python3.9[209149]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:03:08 compute-0 systemd[1]: Started libpod-conmon-a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9.scope.
Oct 02 08:03:08 compute-0 podman[209150]: 2025-10-02 08:03:08.95145992 +0000 UTC m=+0.074599833 container exec a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 02 08:03:08 compute-0 podman[209150]: 2025-10-02 08:03:08.986596023 +0000 UTC m=+0.109735926 container exec_died a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 02 08:03:09 compute-0 systemd[1]: libpod-conmon-a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9.scope: Deactivated successfully.
Oct 02 08:03:09 compute-0 sudo[209147]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:09 compute-0 sudo[209330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzibgqgrxatkiypvpmtdwsmriciotdne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392189.2266772-1484-272815470211946/AnsiballZ_podman_container_exec.py'
Oct 02 08:03:09 compute-0 sudo[209330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:09 compute-0 python3.9[209332]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 02 08:03:10 compute-0 systemd[1]: Started libpod-conmon-a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9.scope.
Oct 02 08:03:10 compute-0 podman[209333]: 2025-10-02 08:03:10.019968445 +0000 UTC m=+0.090783941 container exec a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, version=9.6, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:03:10 compute-0 podman[209333]: 2025-10-02 08:03:10.0506941 +0000 UTC m=+0.121509296 container exec_died a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Oct 02 08:03:10 compute-0 systemd[1]: libpod-conmon-a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9.scope: Deactivated successfully.
Oct 02 08:03:10 compute-0 sudo[209330]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:10 compute-0 sudo[209512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzwbftsppmwdrbcanhrykxnpzovccrkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392190.3349004-1492-178603261261074/AnsiballZ_file.py'
Oct 02 08:03:10 compute-0 sudo[209512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:10 compute-0 python3.9[209514]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:10 compute-0 sudo[209512]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.002 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.004 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.025 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.026 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.026 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.046 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.047 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.660 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.660 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.661 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.661 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.902 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.903 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6051MB free_disk=73.50031280517578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.904 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.904 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.991 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:03:15 compute-0 nova_compute[192567]: 2025-10-02 08:03:15.992 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:03:16 compute-0 nova_compute[192567]: 2025-10-02 08:03:16.024 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:03:16 compute-0 nova_compute[192567]: 2025-10-02 08:03:16.051 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:03:16 compute-0 nova_compute[192567]: 2025-10-02 08:03:16.054 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:03:16 compute-0 nova_compute[192567]: 2025-10-02 08:03:16.055 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:03:16 compute-0 podman[209539]: 2025-10-02 08:03:16.232134382 +0000 UTC m=+0.129718093 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Oct 02 08:03:17 compute-0 nova_compute[192567]: 2025-10-02 08:03:17.056 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:03:17 compute-0 nova_compute[192567]: 2025-10-02 08:03:17.057 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:03:17 compute-0 nova_compute[192567]: 2025-10-02 08:03:17.057 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:03:17 compute-0 nova_compute[192567]: 2025-10-02 08:03:17.058 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:03:17 compute-0 nova_compute[192567]: 2025-10-02 08:03:17.058 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:03:17 compute-0 nova_compute[192567]: 2025-10-02 08:03:17.058 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:03:25 compute-0 podman[209562]: 2025-10-02 08:03:25.178031433 +0000 UTC m=+0.076077639 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Oct 02 08:03:25 compute-0 podman[209560]: 2025-10-02 08:03:25.19066642 +0000 UTC m=+0.091337829 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 02 08:03:25 compute-0 podman[209561]: 2025-10-02 08:03:25.263916669 +0000 UTC m=+0.158686092 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct 02 08:03:27 compute-0 podman[209625]: 2025-10-02 08:03:27.184606659 +0000 UTC m=+0.091805164 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:03:31 compute-0 podman[209645]: 2025-10-02 08:03:31.149866416 +0000 UTC m=+0.067251993 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:03:34 compute-0 sudo[209794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcogqyryyitnvrxjphkoojjtortnnsau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392214.6367395-1700-50768661819404/AnsiballZ_file.py'
Oct 02 08:03:34 compute-0 sudo[209794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:35 compute-0 python3.9[209796]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:35 compute-0 sudo[209794]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:35 compute-0 sudo[209946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptiscddbdpjhewqertdcvygvcigtipmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392215.4900107-1716-108512315450437/AnsiballZ_stat.py'
Oct 02 08:03:35 compute-0 sudo[209946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:36 compute-0 python3.9[209948]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:03:36 compute-0 sudo[209946]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:36 compute-0 sudo[210069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkgejryapaqrncznhwygflqwsjfhfeiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392215.4900107-1716-108512315450437/AnsiballZ_copy.py'
Oct 02 08:03:36 compute-0 sudo[210069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:36 compute-0 python3.9[210071]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759392215.4900107-1716-108512315450437/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:36 compute-0 sudo[210069]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:37 compute-0 sudo[210221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvtwhmijbuvasqesfmpgcvftyodxharr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392217.175058-1748-11249113749009/AnsiballZ_file.py'
Oct 02 08:03:37 compute-0 sudo[210221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:37 compute-0 python3.9[210223]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:37 compute-0 sudo[210221]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:38 compute-0 sudo[210373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpqmtydhwwngtrkdmjlhhtrhnhiuygig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392218.078713-1764-164645844536396/AnsiballZ_stat.py'
Oct 02 08:03:38 compute-0 sudo[210373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:38 compute-0 python3.9[210375]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:03:38 compute-0 sudo[210373]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:39 compute-0 sudo[210451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uorzvkdupniquhxzvjhzevvzathrsjcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392218.078713-1764-164645844536396/AnsiballZ_file.py'
Oct 02 08:03:39 compute-0 sudo[210451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:39 compute-0 python3.9[210453]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:39 compute-0 sudo[210451]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:39 compute-0 sudo[210603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfftazvvkwyumbfvuffwxsoetvduiecm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392219.5605335-1788-126557729298624/AnsiballZ_stat.py'
Oct 02 08:03:39 compute-0 sudo[210603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:40 compute-0 python3.9[210605]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:03:40 compute-0 sudo[210603]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:40 compute-0 sudo[210681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmovyeofwsnuootinmvhzkitvshmgpae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392219.5605335-1788-126557729298624/AnsiballZ_file.py'
Oct 02 08:03:40 compute-0 sudo[210681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:40 compute-0 python3.9[210683]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.tm43l576 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:40 compute-0 sudo[210681]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:41 compute-0 sudo[210833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klkgfvpdztxyjiyrjlxzvtrcvgctpxjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392220.996773-1812-168069578876222/AnsiballZ_stat.py'
Oct 02 08:03:41 compute-0 sudo[210833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:41 compute-0 python3.9[210835]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:03:41 compute-0 sudo[210833]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:41 compute-0 sudo[210911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkxkjuyaxliranulswdcdzjukriigxso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392220.996773-1812-168069578876222/AnsiballZ_file.py'
Oct 02 08:03:41 compute-0 sudo[210911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:42 compute-0 python3.9[210913]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:42 compute-0 sudo[210911]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:42 compute-0 sudo[211063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efxtomyanmeeiecwhneqrdlyzhuqddkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392222.462889-1838-55690010804772/AnsiballZ_command.py'
Oct 02 08:03:42 compute-0 sudo[211063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:43 compute-0 python3.9[211065]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:03:43 compute-0 sudo[211063]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:43 compute-0 sudo[211216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npwaiftdehwhfzyntzlnbjqwxwizpqzj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759392223.2749321-1854-271101375676949/AnsiballZ_edpm_nftables_from_files.py'
Oct 02 08:03:43 compute-0 sudo[211216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:44 compute-0 python3[211218]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 02 08:03:44 compute-0 sudo[211216]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:44 compute-0 sudo[211368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixsavlupjfvgvszgtodrkjhkaporsmpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392224.4485002-1870-231085099171175/AnsiballZ_stat.py'
Oct 02 08:03:44 compute-0 sudo[211368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:45 compute-0 python3.9[211370]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:03:45 compute-0 sudo[211368]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:45 compute-0 sudo[211446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuoiyikhqgetfxjdwslagbtmjbxnhmns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392224.4485002-1870-231085099171175/AnsiballZ_file.py'
Oct 02 08:03:45 compute-0 sudo[211446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:45 compute-0 python3.9[211448]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:45 compute-0 sudo[211446]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:03:45.962 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:03:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:03:45.962 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:03:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:03:45.963 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:03:46 compute-0 sudo[211614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prprlolnbfcwlrgnnmccfhvbbveltgno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392225.972222-1894-272493751130210/AnsiballZ_stat.py'
Oct 02 08:03:46 compute-0 sudo[211614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:46 compute-0 podman[211572]: 2025-10-02 08:03:46.445709716 +0000 UTC m=+0.092676190 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:03:46 compute-0 python3.9[211619]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:03:46 compute-0 sudo[211614]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:46 compute-0 sudo[211698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnkzgcytrktexjezygtrhqmgrvawavlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392225.972222-1894-272493751130210/AnsiballZ_file.py'
Oct 02 08:03:46 compute-0 sudo[211698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:47 compute-0 python3.9[211700]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:47 compute-0 sudo[211698]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:47 compute-0 sudo[211850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fujnvhxyqigbdkkxporvnbxiqdypbcst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392227.4920945-1918-37549779430827/AnsiballZ_stat.py'
Oct 02 08:03:47 compute-0 sudo[211850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:48 compute-0 python3.9[211852]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:03:48 compute-0 sudo[211850]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:48 compute-0 sudo[211928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syzgorwxmfdcgtpylpqsuotltgnktftv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392227.4920945-1918-37549779430827/AnsiballZ_file.py'
Oct 02 08:03:48 compute-0 sudo[211928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:48 compute-0 python3.9[211930]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:48 compute-0 sudo[211928]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:49 compute-0 sudo[212080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuywxqzoweukgvovwzbhzimgopwxuock ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392228.9978747-1942-129009494776203/AnsiballZ_stat.py'
Oct 02 08:03:49 compute-0 sudo[212080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:49 compute-0 python3.9[212082]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:03:49 compute-0 sudo[212080]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:49 compute-0 sudo[212158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gunsylgppzkwrrkdxnprwzclbldlkufd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392228.9978747-1942-129009494776203/AnsiballZ_file.py'
Oct 02 08:03:49 compute-0 sudo[212158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:50 compute-0 python3.9[212160]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:50 compute-0 sudo[212158]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:50 compute-0 sudo[212310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpwjwqgonuevtrglbsikaoiftnufsxib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392230.3583899-1966-112014387907972/AnsiballZ_stat.py'
Oct 02 08:03:50 compute-0 sudo[212310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:51 compute-0 python3.9[212312]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 02 08:03:51 compute-0 sudo[212310]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:51 compute-0 sudo[212435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfykgzgxiejtnijgvxdghchjeamtdknn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392230.3583899-1966-112014387907972/AnsiballZ_copy.py'
Oct 02 08:03:51 compute-0 sudo[212435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:51 compute-0 python3.9[212437]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759392230.3583899-1966-112014387907972/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:51 compute-0 sudo[212435]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:52 compute-0 sudo[212587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlojnjxmxrdgtyfvlayseqshcrvdaluf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392232.0512447-1996-182351818506029/AnsiballZ_file.py'
Oct 02 08:03:52 compute-0 sudo[212587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:52 compute-0 python3.9[212589]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:52 compute-0 sudo[212587]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:53 compute-0 sudo[212739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbroliaeibenogvxtvlompkrbhcoakav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392232.9115207-2012-90182395958332/AnsiballZ_command.py'
Oct 02 08:03:53 compute-0 sudo[212739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:53 compute-0 python3.9[212741]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:03:53 compute-0 sudo[212739]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:54 compute-0 sudo[212894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyrmegltkwbjuxokkgwyhyfpsselnics ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392233.8566232-2028-10193674100002/AnsiballZ_blockinfile.py'
Oct 02 08:03:54 compute-0 sudo[212894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:54 compute-0 python3.9[212896]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:54 compute-0 sudo[212894]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:55 compute-0 sudo[213077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykavrugsiasxudnngouddpmadsdtciln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392234.9501588-2046-195470096606200/AnsiballZ_command.py'
Oct 02 08:03:55 compute-0 sudo[213077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:55 compute-0 podman[213020]: 2025-10-02 08:03:55.39740662 +0000 UTC m=+0.099880676 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:03:55 compute-0 podman[213021]: 2025-10-02 08:03:55.419632988 +0000 UTC m=+0.116684094 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 02 08:03:55 compute-0 podman[213023]: 2025-10-02 08:03:55.452023095 +0000 UTC m=+0.144045643 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 02 08:03:55 compute-0 python3.9[213098]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:03:55 compute-0 sudo[213077]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:56 compute-0 sudo[213257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjdcbtlfradfsnvqpmjedkrixnlcnge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392235.8430083-2062-24140594155453/AnsiballZ_stat.py'
Oct 02 08:03:56 compute-0 sudo[213257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:56 compute-0 python3.9[213259]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 02 08:03:56 compute-0 sudo[213257]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:57 compute-0 sudo[213411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clscwwawehyvcifrzwbtcdtbswvocrim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392236.8005042-2078-189325044584140/AnsiballZ_command.py'
Oct 02 08:03:57 compute-0 sudo[213411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:57 compute-0 podman[213413]: 2025-10-02 08:03:57.343650822 +0000 UTC m=+0.083345478 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 02 08:03:57 compute-0 python3.9[213414]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 02 08:03:57 compute-0 sudo[213411]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:58 compute-0 sudo[213587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gireikvpdwilgdfkkpanjgoaprrufulj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759392237.7544715-2094-27111286259337/AnsiballZ_file.py'
Oct 02 08:03:58 compute-0 sudo[213587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:03:58 compute-0 python3.9[213589]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 02 08:03:58 compute-0 sudo[213587]: pam_unix(sudo:session): session closed for user root
Oct 02 08:03:58 compute-0 sshd-session[192866]: Connection closed by 192.168.122.30 port 45122
Oct 02 08:03:58 compute-0 sshd-session[192863]: pam_unix(sshd:session): session closed for user zuul
Oct 02 08:03:58 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Oct 02 08:03:58 compute-0 systemd[1]: session-28.scope: Consumed 1min 44.349s CPU time.
Oct 02 08:03:58 compute-0 systemd-logind[827]: Session 28 logged out. Waiting for processes to exit.
Oct 02 08:03:58 compute-0 systemd-logind[827]: Removed session 28.
Oct 02 08:03:59 compute-0 podman[203011]: time="2025-10-02T08:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:03:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:03:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2968 "" "Go-http-client/1.1"
Oct 02 08:04:01 compute-0 openstack_network_exporter[205118]: ERROR   08:04:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:04:01 compute-0 openstack_network_exporter[205118]: ERROR   08:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:04:01 compute-0 openstack_network_exporter[205118]: ERROR   08:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:04:01 compute-0 openstack_network_exporter[205118]: ERROR   08:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:04:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:04:01 compute-0 openstack_network_exporter[205118]: ERROR   08:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:04:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:04:02 compute-0 podman[213618]: 2025-10-02 08:04:02.176210907 +0000 UTC m=+0.085430733 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:04:14 compute-0 nova_compute[192567]: 2025-10-02 08:04:14.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.651 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.651 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.652 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.652 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.879 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.881 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6120MB free_disk=73.50032806396484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.882 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.882 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.945 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.946 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.973 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.987 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.989 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:04:15 compute-0 nova_compute[192567]: 2025-10-02 08:04:15.990 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:04:16 compute-0 nova_compute[192567]: 2025-10-02 08:04:16.990 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:04:16 compute-0 nova_compute[192567]: 2025-10-02 08:04:16.990 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:04:16 compute-0 nova_compute[192567]: 2025-10-02 08:04:16.991 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:04:17 compute-0 nova_compute[192567]: 2025-10-02 08:04:17.014 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:04:17 compute-0 nova_compute[192567]: 2025-10-02 08:04:17.014 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:04:17 compute-0 podman[213642]: 2025-10-02 08:04:17.185328097 +0000 UTC m=+0.097409313 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public)
Oct 02 08:04:17 compute-0 nova_compute[192567]: 2025-10-02 08:04:17.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:04:17 compute-0 nova_compute[192567]: 2025-10-02 08:04:17.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:04:18 compute-0 nova_compute[192567]: 2025-10-02 08:04:18.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:04:18 compute-0 nova_compute[192567]: 2025-10-02 08:04:18.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:04:18 compute-0 nova_compute[192567]: 2025-10-02 08:04:18.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:04:18 compute-0 nova_compute[192567]: 2025-10-02 08:04:18.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:04:26 compute-0 podman[213663]: 2025-10-02 08:04:26.175191077 +0000 UTC m=+0.080572509 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 02 08:04:26 compute-0 podman[213665]: 2025-10-02 08:04:26.202501997 +0000 UTC m=+0.096185774 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:04:26 compute-0 podman[213664]: 2025-10-02 08:04:26.268241514 +0000 UTC m=+0.174923936 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Oct 02 08:04:28 compute-0 podman[213729]: 2025-10-02 08:04:28.184939398 +0000 UTC m=+0.087310489 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:04:29 compute-0 podman[203011]: time="2025-10-02T08:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:04:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:04:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2978 "" "Go-http-client/1.1"
Oct 02 08:04:31 compute-0 openstack_network_exporter[205118]: ERROR   08:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:04:31 compute-0 openstack_network_exporter[205118]: ERROR   08:04:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:04:31 compute-0 openstack_network_exporter[205118]: ERROR   08:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:04:31 compute-0 openstack_network_exporter[205118]: ERROR   08:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:04:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:04:31 compute-0 openstack_network_exporter[205118]: ERROR   08:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:04:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:04:33 compute-0 podman[213753]: 2025-10-02 08:04:33.152864042 +0000 UTC m=+0.068106742 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:04:42 compute-0 PackageKit[131059]: daemon quit
Oct 02 08:04:42 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 02 08:04:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:04:45.962 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:04:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:04:45.963 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:04:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:04:45.963 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:04:48 compute-0 podman[213777]: 2025-10-02 08:04:48.16674544 +0000 UTC m=+0.074298453 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Oct 02 08:04:57 compute-0 podman[213799]: 2025-10-02 08:04:57.186908995 +0000 UTC m=+0.097109473 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 02 08:04:57 compute-0 podman[213801]: 2025-10-02 08:04:57.195901945 +0000 UTC m=+0.103041168 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 02 08:04:57 compute-0 podman[213800]: 2025-10-02 08:04:57.239463421 +0000 UTC m=+0.148548305 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:04:59 compute-0 podman[213865]: 2025-10-02 08:04:59.183953062 +0000 UTC m=+0.092411369 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 02 08:05:01 compute-0 openstack_network_exporter[205118]: ERROR   08:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:05:01 compute-0 openstack_network_exporter[205118]: ERROR   08:05:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:05:01 compute-0 openstack_network_exporter[205118]: ERROR   08:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:05:01 compute-0 openstack_network_exporter[205118]: ERROR   08:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:05:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:05:01 compute-0 openstack_network_exporter[205118]: ERROR   08:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:05:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:05:04 compute-0 podman[213886]: 2025-10-02 08:05:04.142342558 +0000 UTC m=+0.062038732 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:05:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:05:06.091 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:05:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:05:06.092 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:05:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:05:06.093 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:05:08 compute-0 unix_chkpwd[213912]: password check failed for user (root)
Oct 02 08:05:08 compute-0 sshd-session[213910]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 02 08:05:11 compute-0 sshd-session[213910]: Failed password for root from 80.94.93.233 port 49164 ssh2
Oct 02 08:05:12 compute-0 unix_chkpwd[213913]: password check failed for user (root)
Oct 02 08:05:14 compute-0 nova_compute[192567]: 2025-10-02 08:05:14.621 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:05:14 compute-0 sshd-session[213910]: Failed password for root from 80.94.93.233 port 49164 ssh2
Oct 02 08:05:14 compute-0 unix_chkpwd[213914]: password check failed for user (root)
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.643 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.665 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.666 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.666 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.666 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.915 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.917 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6168MB free_disk=73.50375366210938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.917 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.918 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.987 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:05:15 compute-0 nova_compute[192567]: 2025-10-02 08:05:15.987 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:05:16 compute-0 nova_compute[192567]: 2025-10-02 08:05:16.014 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:05:16 compute-0 nova_compute[192567]: 2025-10-02 08:05:16.039 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:05:16 compute-0 nova_compute[192567]: 2025-10-02 08:05:16.041 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:05:16 compute-0 nova_compute[192567]: 2025-10-02 08:05:16.042 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:05:16 compute-0 sshd-session[213910]: Failed password for root from 80.94.93.233 port 49164 ssh2
Oct 02 08:05:17 compute-0 nova_compute[192567]: 2025-10-02 08:05:17.023 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:05:17 compute-0 sshd-session[213910]: Received disconnect from 80.94.93.233 port 49164:11:  [preauth]
Oct 02 08:05:17 compute-0 sshd-session[213910]: Disconnected from authenticating user root 80.94.93.233 port 49164 [preauth]
Oct 02 08:05:17 compute-0 sshd-session[213910]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 02 08:05:18 compute-0 nova_compute[192567]: 2025-10-02 08:05:18.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:05:18 compute-0 nova_compute[192567]: 2025-10-02 08:05:18.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:05:18 compute-0 nova_compute[192567]: 2025-10-02 08:05:18.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:05:18 compute-0 nova_compute[192567]: 2025-10-02 08:05:18.643 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:05:18 compute-0 nova_compute[192567]: 2025-10-02 08:05:18.644 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:05:18 compute-0 unix_chkpwd[213917]: password check failed for user (root)
Oct 02 08:05:18 compute-0 sshd-session[213915]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 02 08:05:19 compute-0 podman[213918]: 2025-10-02 08:05:19.184505977 +0000 UTC m=+0.092679574 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Oct 02 08:05:19 compute-0 nova_compute[192567]: 2025-10-02 08:05:19.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:05:19 compute-0 nova_compute[192567]: 2025-10-02 08:05:19.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:05:19 compute-0 nova_compute[192567]: 2025-10-02 08:05:19.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:05:19 compute-0 nova_compute[192567]: 2025-10-02 08:05:19.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:05:20 compute-0 sshd-session[213915]: Failed password for root from 80.94.93.233 port 46980 ssh2
Oct 02 08:05:20 compute-0 nova_compute[192567]: 2025-10-02 08:05:20.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:05:21 compute-0 unix_chkpwd[213940]: password check failed for user (root)
Oct 02 08:05:22 compute-0 sshd-session[213915]: Failed password for root from 80.94.93.233 port 46980 ssh2
Oct 02 08:05:23 compute-0 unix_chkpwd[213941]: password check failed for user (root)
Oct 02 08:05:25 compute-0 sshd-session[213915]: Failed password for root from 80.94.93.233 port 46980 ssh2
Oct 02 08:05:25 compute-0 sshd-session[213915]: Received disconnect from 80.94.93.233 port 46980:11:  [preauth]
Oct 02 08:05:25 compute-0 sshd-session[213915]: Disconnected from authenticating user root 80.94.93.233 port 46980 [preauth]
Oct 02 08:05:25 compute-0 sshd-session[213915]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 02 08:05:26 compute-0 unix_chkpwd[213944]: password check failed for user (root)
Oct 02 08:05:26 compute-0 sshd-session[213942]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 02 08:05:28 compute-0 podman[213945]: 2025-10-02 08:05:28.172513958 +0000 UTC m=+0.084143637 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct 02 08:05:28 compute-0 podman[213947]: 2025-10-02 08:05:28.176669057 +0000 UTC m=+0.081737611 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:05:28 compute-0 podman[213946]: 2025-10-02 08:05:28.20976914 +0000 UTC m=+0.123692231 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:05:28 compute-0 sshd-session[213942]: Failed password for root from 80.94.93.233 port 28496 ssh2
Oct 02 08:05:29 compute-0 podman[203011]: time="2025-10-02T08:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:05:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:05:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2978 "" "Go-http-client/1.1"
Oct 02 08:05:30 compute-0 podman[214006]: 2025-10-02 08:05:30.184716088 +0000 UTC m=+0.099208747 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 02 08:05:30 compute-0 unix_chkpwd[214027]: password check failed for user (root)
Oct 02 08:05:31 compute-0 openstack_network_exporter[205118]: ERROR   08:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:05:31 compute-0 openstack_network_exporter[205118]: ERROR   08:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:05:31 compute-0 openstack_network_exporter[205118]: ERROR   08:05:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:05:31 compute-0 openstack_network_exporter[205118]: ERROR   08:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:05:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:05:31 compute-0 openstack_network_exporter[205118]: ERROR   08:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:05:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:05:32 compute-0 sshd-session[213942]: Failed password for root from 80.94.93.233 port 28496 ssh2
Oct 02 08:05:34 compute-0 unix_chkpwd[214028]: password check failed for user (root)
Oct 02 08:05:35 compute-0 podman[214029]: 2025-10-02 08:05:35.165503701 +0000 UTC m=+0.080565915 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:05:36 compute-0 sshd-session[213942]: Failed password for root from 80.94.93.233 port 28496 ssh2
Oct 02 08:05:36 compute-0 sshd-session[213942]: Received disconnect from 80.94.93.233 port 28496:11:  [preauth]
Oct 02 08:05:36 compute-0 sshd-session[213942]: Disconnected from authenticating user root 80.94.93.233 port 28496 [preauth]
Oct 02 08:05:36 compute-0 sshd-session[213942]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 02 08:05:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:05:45.964 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:05:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:05:45.970 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:05:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:05:45.971 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:05:50 compute-0 podman[214053]: 2025-10-02 08:05:50.175733613 +0000 UTC m=+0.074844857 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Oct 02 08:05:59 compute-0 podman[214076]: 2025-10-02 08:05:59.169558344 +0000 UTC m=+0.073828745 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 08:05:59 compute-0 podman[214078]: 2025-10-02 08:05:59.196178904 +0000 UTC m=+0.089251826 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:05:59 compute-0 podman[214077]: 2025-10-02 08:05:59.238467784 +0000 UTC m=+0.132844807 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:05:59 compute-0 podman[203011]: time="2025-10-02T08:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:05:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:05:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2980 "" "Go-http-client/1.1"
Oct 02 08:06:01 compute-0 podman[214143]: 2025-10-02 08:06:01.22201071 +0000 UTC m=+0.128569383 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:06:01 compute-0 openstack_network_exporter[205118]: ERROR   08:06:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:06:01 compute-0 openstack_network_exporter[205118]: ERROR   08:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:06:01 compute-0 openstack_network_exporter[205118]: ERROR   08:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:06:01 compute-0 openstack_network_exporter[205118]: ERROR   08:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:06:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:06:01 compute-0 openstack_network_exporter[205118]: ERROR   08:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:06:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:06:06 compute-0 podman[214163]: 2025-10-02 08:06:06.161336869 +0000 UTC m=+0.073391711 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:06:14 compute-0 nova_compute[192567]: 2025-10-02 08:06:14.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:14 compute-0 nova_compute[192567]: 2025-10-02 08:06:14.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:06:14 compute-0 nova_compute[192567]: 2025-10-02 08:06:14.638 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:06:14 compute-0 nova_compute[192567]: 2025-10-02 08:06:14.639 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:14 compute-0 nova_compute[192567]: 2025-10-02 08:06:14.640 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:06:14 compute-0 nova_compute[192567]: 2025-10-02 08:06:14.672 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:15 compute-0 nova_compute[192567]: 2025-10-02 08:06:15.701 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:15 compute-0 nova_compute[192567]: 2025-10-02 08:06:15.729 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:06:15 compute-0 nova_compute[192567]: 2025-10-02 08:06:15.730 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:06:15 compute-0 nova_compute[192567]: 2025-10-02 08:06:15.730 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:06:15 compute-0 nova_compute[192567]: 2025-10-02 08:06:15.731 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:06:15 compute-0 nova_compute[192567]: 2025-10-02 08:06:15.972 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:06:15 compute-0 nova_compute[192567]: 2025-10-02 08:06:15.974 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6197MB free_disk=73.50374984741211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:06:15 compute-0 nova_compute[192567]: 2025-10-02 08:06:15.974 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:06:15 compute-0 nova_compute[192567]: 2025-10-02 08:06:15.975 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:06:16 compute-0 nova_compute[192567]: 2025-10-02 08:06:16.138 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:06:16 compute-0 nova_compute[192567]: 2025-10-02 08:06:16.139 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:06:16 compute-0 nova_compute[192567]: 2025-10-02 08:06:16.289 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:06:16 compute-0 nova_compute[192567]: 2025-10-02 08:06:16.327 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:06:16 compute-0 nova_compute[192567]: 2025-10-02 08:06:16.329 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:06:16 compute-0 nova_compute[192567]: 2025-10-02 08:06:16.330 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:06:17 compute-0 nova_compute[192567]: 2025-10-02 08:06:17.248 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:17 compute-0 nova_compute[192567]: 2025-10-02 08:06:17.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:18 compute-0 nova_compute[192567]: 2025-10-02 08:06:18.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:20 compute-0 nova_compute[192567]: 2025-10-02 08:06:20.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:20 compute-0 nova_compute[192567]: 2025-10-02 08:06:20.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:06:20 compute-0 nova_compute[192567]: 2025-10-02 08:06:20.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:06:20 compute-0 nova_compute[192567]: 2025-10-02 08:06:20.640 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:06:20 compute-0 nova_compute[192567]: 2025-10-02 08:06:20.640 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:21 compute-0 podman[214188]: 2025-10-02 08:06:21.188745489 +0000 UTC m=+0.096328371 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Oct 02 08:06:21 compute-0 nova_compute[192567]: 2025-10-02 08:06:21.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:21 compute-0 nova_compute[192567]: 2025-10-02 08:06:21.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:21 compute-0 nova_compute[192567]: 2025-10-02 08:06:21.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:06:22 compute-0 nova_compute[192567]: 2025-10-02 08:06:22.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:06:29 compute-0 podman[203011]: time="2025-10-02T08:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:06:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:06:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Oct 02 08:06:30 compute-0 podman[214211]: 2025-10-02 08:06:30.200827183 +0000 UTC m=+0.108166133 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:06:30 compute-0 podman[214212]: 2025-10-02 08:06:30.213136068 +0000 UTC m=+0.111998746 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 02 08:06:30 compute-0 podman[214213]: 2025-10-02 08:06:30.232220805 +0000 UTC m=+0.128318021 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd)
Oct 02 08:06:31 compute-0 openstack_network_exporter[205118]: ERROR   08:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:06:31 compute-0 openstack_network_exporter[205118]: ERROR   08:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:06:31 compute-0 openstack_network_exporter[205118]: ERROR   08:06:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:06:31 compute-0 openstack_network_exporter[205118]: ERROR   08:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:06:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:06:31 compute-0 openstack_network_exporter[205118]: ERROR   08:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:06:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:06:32 compute-0 podman[214278]: 2025-10-02 08:06:32.187651967 +0000 UTC m=+0.097399284 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Oct 02 08:06:37 compute-0 podman[214298]: 2025-10-02 08:06:37.175412687 +0000 UTC m=+0.084932583 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:06:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:06:45.964 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:06:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:06:45.964 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:06:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:06:45.965 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:06:52 compute-0 podman[214324]: 2025-10-02 08:06:52.185870766 +0000 UTC m=+0.094595299 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41)
Oct 02 08:06:59 compute-0 podman[203011]: time="2025-10-02T08:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:06:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:06:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2980 "" "Go-http-client/1.1"
Oct 02 08:07:01 compute-0 podman[214347]: 2025-10-02 08:07:01.176182902 +0000 UTC m=+0.073764642 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 02 08:07:01 compute-0 podman[214345]: 2025-10-02 08:07:01.200583296 +0000 UTC m=+0.107702519 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:07:01 compute-0 podman[214346]: 2025-10-02 08:07:01.214099917 +0000 UTC m=+0.114219973 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:07:01 compute-0 openstack_network_exporter[205118]: ERROR   08:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:07:01 compute-0 openstack_network_exporter[205118]: ERROR   08:07:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:07:01 compute-0 openstack_network_exporter[205118]: ERROR   08:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:07:01 compute-0 openstack_network_exporter[205118]: ERROR   08:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:07:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:07:01 compute-0 openstack_network_exporter[205118]: ERROR   08:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:07:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:07:03 compute-0 podman[214406]: 2025-10-02 08:07:03.173730384 +0000 UTC m=+0.086106428 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:07:08 compute-0 podman[214426]: 2025-10-02 08:07:08.18998035 +0000 UTC m=+0.096741815 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.628 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.663 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.663 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.664 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.664 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.903 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.905 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6195MB free_disk=73.5040397644043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.905 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.906 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.973 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:07:15 compute-0 nova_compute[192567]: 2025-10-02 08:07:15.974 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:07:16 compute-0 nova_compute[192567]: 2025-10-02 08:07:16.002 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:07:16 compute-0 nova_compute[192567]: 2025-10-02 08:07:16.080 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:07:16 compute-0 nova_compute[192567]: 2025-10-02 08:07:16.081 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:07:16 compute-0 nova_compute[192567]: 2025-10-02 08:07:16.103 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:07:16 compute-0 nova_compute[192567]: 2025-10-02 08:07:16.119 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:07:16 compute-0 nova_compute[192567]: 2025-10-02 08:07:16.140 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:07:16 compute-0 nova_compute[192567]: 2025-10-02 08:07:16.158 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:07:16 compute-0 nova_compute[192567]: 2025-10-02 08:07:16.161 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:07:16 compute-0 nova_compute[192567]: 2025-10-02 08:07:16.162 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:07:17 compute-0 nova_compute[192567]: 2025-10-02 08:07:17.157 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:07:18 compute-0 nova_compute[192567]: 2025-10-02 08:07:18.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:07:19 compute-0 nova_compute[192567]: 2025-10-02 08:07:19.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:07:20 compute-0 nova_compute[192567]: 2025-10-02 08:07:20.621 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:07:20 compute-0 nova_compute[192567]: 2025-10-02 08:07:20.641 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:07:20 compute-0 nova_compute[192567]: 2025-10-02 08:07:20.642 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:07:20 compute-0 nova_compute[192567]: 2025-10-02 08:07:20.642 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:07:20 compute-0 nova_compute[192567]: 2025-10-02 08:07:20.655 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:07:20 compute-0 nova_compute[192567]: 2025-10-02 08:07:20.655 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:07:22 compute-0 nova_compute[192567]: 2025-10-02 08:07:22.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:07:22 compute-0 nova_compute[192567]: 2025-10-02 08:07:22.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:07:22 compute-0 nova_compute[192567]: 2025-10-02 08:07:22.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:07:23 compute-0 podman[214451]: 2025-10-02 08:07:23.180729366 +0000 UTC m=+0.090388825 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:07:24 compute-0 nova_compute[192567]: 2025-10-02 08:07:24.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:07:29 compute-0 podman[203011]: time="2025-10-02T08:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:07:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:07:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2977 "" "Go-http-client/1.1"
Oct 02 08:07:31 compute-0 openstack_network_exporter[205118]: ERROR   08:07:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:07:31 compute-0 openstack_network_exporter[205118]: ERROR   08:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:07:31 compute-0 openstack_network_exporter[205118]: ERROR   08:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:07:31 compute-0 openstack_network_exporter[205118]: ERROR   08:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:07:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:07:31 compute-0 openstack_network_exporter[205118]: ERROR   08:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:07:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:07:32 compute-0 podman[214474]: 2025-10-02 08:07:32.167915088 +0000 UTC m=+0.074449891 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct 02 08:07:32 compute-0 podman[214476]: 2025-10-02 08:07:32.179113951 +0000 UTC m=+0.072860384 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 02 08:07:32 compute-0 podman[214475]: 2025-10-02 08:07:32.231977171 +0000 UTC m=+0.127842898 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 02 08:07:34 compute-0 podman[214537]: 2025-10-02 08:07:34.144107697 +0000 UTC m=+0.066215817 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:07:39 compute-0 podman[214558]: 2025-10-02 08:07:39.16567003 +0000 UTC m=+0.075544304 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:07:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:07:45.964 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:07:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:07:45.965 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:07:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:07:45.965 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:07:54 compute-0 podman[214582]: 2025-10-02 08:07:54.177317954 +0000 UTC m=+0.089630292 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Oct 02 08:07:59 compute-0 podman[203011]: time="2025-10-02T08:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:07:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:07:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2980 "" "Go-http-client/1.1"
Oct 02 08:08:01 compute-0 openstack_network_exporter[205118]: ERROR   08:08:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:08:01 compute-0 openstack_network_exporter[205118]: ERROR   08:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:08:01 compute-0 openstack_network_exporter[205118]: ERROR   08:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:08:01 compute-0 openstack_network_exporter[205118]: ERROR   08:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:08:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:08:01 compute-0 openstack_network_exporter[205118]: ERROR   08:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:08:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:08:03 compute-0 podman[214605]: 2025-10-02 08:08:03.212504478 +0000 UTC m=+0.103716473 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:08:03 compute-0 podman[214603]: 2025-10-02 08:08:03.213507069 +0000 UTC m=+0.116639454 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 08:08:03 compute-0 podman[214604]: 2025-10-02 08:08:03.224340415 +0000 UTC m=+0.128213763 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:08:05 compute-0 podman[214666]: 2025-10-02 08:08:05.165636046 +0000 UTC m=+0.079276104 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:08:10 compute-0 podman[214686]: 2025-10-02 08:08:10.164425025 +0000 UTC m=+0.074413922 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:08:16 compute-0 nova_compute[192567]: 2025-10-02 08:08:16.635 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:08:16 compute-0 nova_compute[192567]: 2025-10-02 08:08:16.669 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:08:16 compute-0 nova_compute[192567]: 2025-10-02 08:08:16.670 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:08:16 compute-0 nova_compute[192567]: 2025-10-02 08:08:16.671 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:08:16 compute-0 nova_compute[192567]: 2025-10-02 08:08:16.671 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:08:16 compute-0 nova_compute[192567]: 2025-10-02 08:08:16.936 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:08:16 compute-0 nova_compute[192567]: 2025-10-02 08:08:16.939 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6200MB free_disk=73.5040397644043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:08:16 compute-0 nova_compute[192567]: 2025-10-02 08:08:16.939 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:08:16 compute-0 nova_compute[192567]: 2025-10-02 08:08:16.940 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:08:17 compute-0 nova_compute[192567]: 2025-10-02 08:08:17.027 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:08:17 compute-0 nova_compute[192567]: 2025-10-02 08:08:17.028 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:08:17 compute-0 nova_compute[192567]: 2025-10-02 08:08:17.068 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:08:17 compute-0 nova_compute[192567]: 2025-10-02 08:08:17.086 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:08:17 compute-0 nova_compute[192567]: 2025-10-02 08:08:17.088 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:08:17 compute-0 nova_compute[192567]: 2025-10-02 08:08:17.089 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:08:18 compute-0 nova_compute[192567]: 2025-10-02 08:08:18.075 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:08:19 compute-0 nova_compute[192567]: 2025-10-02 08:08:19.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:08:19 compute-0 nova_compute[192567]: 2025-10-02 08:08:19.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:08:20 compute-0 nova_compute[192567]: 2025-10-02 08:08:20.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:08:21 compute-0 nova_compute[192567]: 2025-10-02 08:08:21.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:08:21 compute-0 nova_compute[192567]: 2025-10-02 08:08:21.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:08:21 compute-0 nova_compute[192567]: 2025-10-02 08:08:21.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:08:21 compute-0 nova_compute[192567]: 2025-10-02 08:08:21.644 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:08:22 compute-0 nova_compute[192567]: 2025-10-02 08:08:22.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:08:22 compute-0 nova_compute[192567]: 2025-10-02 08:08:22.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:08:23 compute-0 nova_compute[192567]: 2025-10-02 08:08:23.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:08:24 compute-0 nova_compute[192567]: 2025-10-02 08:08:24.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:08:25 compute-0 podman[214710]: 2025-10-02 08:08:25.181221518 +0000 UTC m=+0.082249205 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:08:29 compute-0 podman[203011]: time="2025-10-02T08:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:08:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:08:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Oct 02 08:08:31 compute-0 openstack_network_exporter[205118]: ERROR   08:08:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:08:31 compute-0 openstack_network_exporter[205118]: ERROR   08:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:08:31 compute-0 openstack_network_exporter[205118]: ERROR   08:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:08:31 compute-0 openstack_network_exporter[205118]: ERROR   08:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:08:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:08:31 compute-0 openstack_network_exporter[205118]: ERROR   08:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:08:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:08:34 compute-0 podman[214733]: 2025-10-02 08:08:34.214758806 +0000 UTC m=+0.101243372 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 02 08:08:34 compute-0 podman[214732]: 2025-10-02 08:08:34.214788407 +0000 UTC m=+0.118981706 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:08:34 compute-0 podman[214731]: 2025-10-02 08:08:34.218311147 +0000 UTC m=+0.122069612 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:08:36 compute-0 podman[214796]: 2025-10-02 08:08:36.180669924 +0000 UTC m=+0.088812903 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:08:41 compute-0 podman[214816]: 2025-10-02 08:08:41.189590807 +0000 UTC m=+0.092839379 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:08:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:08:45.722 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:08:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:08:45.723 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:08:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:08:45.965 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:08:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:08:45.966 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:08:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:08:45.966 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:08:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:08:53.727 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:08:56 compute-0 podman[214840]: 2025-10-02 08:08:56.191647175 +0000 UTC m=+0.096778112 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Oct 02 08:08:59 compute-0 podman[203011]: time="2025-10-02T08:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:08:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:08:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Oct 02 08:09:01 compute-0 openstack_network_exporter[205118]: ERROR   08:09:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:09:01 compute-0 openstack_network_exporter[205118]: ERROR   08:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:09:01 compute-0 openstack_network_exporter[205118]: ERROR   08:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:09:01 compute-0 openstack_network_exporter[205118]: ERROR   08:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:09:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:09:01 compute-0 openstack_network_exporter[205118]: ERROR   08:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:09:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:09:05 compute-0 podman[214862]: 2025-10-02 08:09:05.218690128 +0000 UTC m=+0.114904408 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 02 08:09:05 compute-0 podman[214864]: 2025-10-02 08:09:05.223932201 +0000 UTC m=+0.108120636 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:09:05 compute-0 podman[214863]: 2025-10-02 08:09:05.258534892 +0000 UTC m=+0.148471576 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:09:07 compute-0 podman[214926]: 2025-10-02 08:09:07.213931241 +0000 UTC m=+0.110291525 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid)
Oct 02 08:09:12 compute-0 podman[214948]: 2025-10-02 08:09:12.167772114 +0000 UTC m=+0.075939903 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.653 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.653 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.654 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.654 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.854 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.856 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6194MB free_disk=73.5040397644043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.856 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.857 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.956 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.957 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:09:17 compute-0 nova_compute[192567]: 2025-10-02 08:09:17.988 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:09:18 compute-0 nova_compute[192567]: 2025-10-02 08:09:18.020 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:09:18 compute-0 nova_compute[192567]: 2025-10-02 08:09:18.024 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:09:18 compute-0 nova_compute[192567]: 2025-10-02 08:09:18.024 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:20 compute-0 nova_compute[192567]: 2025-10-02 08:09:20.022 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:09:20 compute-0 nova_compute[192567]: 2025-10-02 08:09:20.023 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:09:20 compute-0 nova_compute[192567]: 2025-10-02 08:09:20.023 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:09:21 compute-0 nova_compute[192567]: 2025-10-02 08:09:21.628 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:09:21 compute-0 nova_compute[192567]: 2025-10-02 08:09:21.629 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:09:21 compute-0 nova_compute[192567]: 2025-10-02 08:09:21.629 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:09:21 compute-0 nova_compute[192567]: 2025-10-02 08:09:21.651 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:09:21 compute-0 nova_compute[192567]: 2025-10-02 08:09:21.652 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:09:22 compute-0 nova_compute[192567]: 2025-10-02 08:09:22.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:09:22 compute-0 nova_compute[192567]: 2025-10-02 08:09:22.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:09:23 compute-0 nova_compute[192567]: 2025-10-02 08:09:23.622 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:09:25 compute-0 nova_compute[192567]: 2025-10-02 08:09:25.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:09:26 compute-0 nova_compute[192567]: 2025-10-02 08:09:26.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:09:27 compute-0 podman[214973]: 2025-10-02 08:09:27.180853155 +0000 UTC m=+0.092008313 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Oct 02 08:09:28 compute-0 sshd-session[214994]: Connection closed by 106.36.198.78 port 34560
Oct 02 08:09:29 compute-0 podman[203011]: time="2025-10-02T08:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:09:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:09:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2985 "" "Go-http-client/1.1"
Oct 02 08:09:31 compute-0 openstack_network_exporter[205118]: ERROR   08:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:09:31 compute-0 openstack_network_exporter[205118]: ERROR   08:09:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:09:31 compute-0 openstack_network_exporter[205118]: ERROR   08:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:09:31 compute-0 openstack_network_exporter[205118]: ERROR   08:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:09:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:09:31 compute-0 openstack_network_exporter[205118]: ERROR   08:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:09:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:09:36 compute-0 podman[214995]: 2025-10-02 08:09:36.199623041 +0000 UTC m=+0.107757616 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:09:36 compute-0 podman[214997]: 2025-10-02 08:09:36.199845788 +0000 UTC m=+0.100108528 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Oct 02 08:09:36 compute-0 podman[214996]: 2025-10-02 08:09:36.219631895 +0000 UTC m=+0.124631772 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Oct 02 08:09:38 compute-0 podman[215059]: 2025-10-02 08:09:38.179922008 +0000 UTC m=+0.089847677 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 02 08:09:40 compute-0 nova_compute[192567]: 2025-10-02 08:09:40.734 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "2e661e5f-2462-4ffd-99a7-afc83d45f425" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:40 compute-0 nova_compute[192567]: 2025-10-02 08:09:40.734 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:40 compute-0 nova_compute[192567]: 2025-10-02 08:09:40.749 2 DEBUG nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.061 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.062 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.076 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.077 2 INFO nova.compute.claims [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.255 2 DEBUG nova.compute.provider_tree [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.287 2 DEBUG nova.scheduler.client.report [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.314 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.315 2 DEBUG nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.377 2 DEBUG nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.377 2 DEBUG nova.network.neutron [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.399 2 INFO nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.417 2 DEBUG nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.570 2 DEBUG nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.572 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.573 2 INFO nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Creating image(s)
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.574 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "/var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.574 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "/var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.575 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "/var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.576 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:41 compute-0 nova_compute[192567]: 2025-10-02 08:09:41.577 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:42 compute-0 nova_compute[192567]: 2025-10-02 08:09:42.081 2 WARNING oslo_policy.policy [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Oct 02 08:09:42 compute-0 nova_compute[192567]: 2025-10-02 08:09:42.081 2 WARNING oslo_policy.policy [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Oct 02 08:09:42 compute-0 nova_compute[192567]: 2025-10-02 08:09:42.613 2 DEBUG nova.network.neutron [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Successfully created port: 782354d7-2469-4521-9850-4777d41a0047 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:09:43 compute-0 podman[215079]: 2025-10-02 08:09:43.186401805 +0000 UTC m=+0.096236565 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:09:43 compute-0 nova_compute[192567]: 2025-10-02 08:09:43.447 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:09:43 compute-0 nova_compute[192567]: 2025-10-02 08:09:43.502 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e.part --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:09:43 compute-0 nova_compute[192567]: 2025-10-02 08:09:43.504 2 DEBUG nova.virt.images [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] f5cf0efc-6f3c-4865-b002-490e9c9b250d was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Oct 02 08:09:43 compute-0 nova_compute[192567]: 2025-10-02 08:09:43.506 2 DEBUG nova.privsep.utils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct 02 08:09:43 compute-0 nova_compute[192567]: 2025-10-02 08:09:43.507 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e.part /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:09:43 compute-0 nova_compute[192567]: 2025-10-02 08:09:43.713 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e.part /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e.converted" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:09:43 compute-0 nova_compute[192567]: 2025-10-02 08:09:43.722 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:09:43 compute-0 nova_compute[192567]: 2025-10-02 08:09:43.785 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e.converted --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:09:43 compute-0 nova_compute[192567]: 2025-10-02 08:09:43.787 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:43 compute-0 nova_compute[192567]: 2025-10-02 08:09:43.811 2 INFO oslo.privsep.daemon [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp86rg6w5p/privsep.sock']
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.545 2 INFO oslo.privsep.daemon [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Spawned new privsep daemon via rootwrap
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.383 57 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.391 57 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.395 57 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.395 57 INFO oslo.privsep.daemon [-] privsep daemon running as pid 57
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.628 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.707 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.710 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.711 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.735 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.800 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.802 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.851 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.852 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.853 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.932 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.933 2 DEBUG nova.virt.disk.api [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Checking if we can resize image /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.934 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.996 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.997 2 DEBUG nova.virt.disk.api [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Cannot resize image /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:09:44 compute-0 nova_compute[192567]: 2025-10-02 08:09:44.998 2 DEBUG nova.objects.instance [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'migration_context' on Instance uuid 2e661e5f-2462-4ffd-99a7-afc83d45f425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.437 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.438 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Ensure instance console log exists: /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.439 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.440 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.440 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.749 2 DEBUG nova.network.neutron [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Successfully updated port: 782354d7-2469-4521-9850-4777d41a0047 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.953 2 DEBUG nova.compute.manager [req-4ec59444-0270-4c0d-9b4d-756930cd9d60 req-f569a888-c153-49a5-9403-f689c294ac78 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Received event network-changed-782354d7-2469-4521-9850-4777d41a0047 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.953 2 DEBUG nova.compute.manager [req-4ec59444-0270-4c0d-9b4d-756930cd9d60 req-f569a888-c153-49a5-9403-f689c294ac78 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Refreshing instance network info cache due to event network-changed-782354d7-2469-4521-9850-4777d41a0047. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.954 2 DEBUG oslo_concurrency.lockutils [req-4ec59444-0270-4c0d-9b4d-756930cd9d60 req-f569a888-c153-49a5-9403-f689c294ac78 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.954 2 DEBUG oslo_concurrency.lockutils [req-4ec59444-0270-4c0d-9b4d-756930cd9d60 req-f569a888-c153-49a5-9403-f689c294ac78 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.954 2 DEBUG nova.network.neutron [req-4ec59444-0270-4c0d-9b4d-756930cd9d60 req-f569a888-c153-49a5-9403-f689c294ac78 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Refreshing network info cache for port 782354d7-2469-4521-9850-4777d41a0047 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:09:45 compute-0 nova_compute[192567]: 2025-10-02 08:09:45.959 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:09:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:45.966 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:45.968 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:45.968 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:46 compute-0 nova_compute[192567]: 2025-10-02 08:09:46.149 2 DEBUG nova.network.neutron [req-4ec59444-0270-4c0d-9b4d-756930cd9d60 req-f569a888-c153-49a5-9403-f689c294ac78 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:09:46 compute-0 nova_compute[192567]: 2025-10-02 08:09:46.886 2 DEBUG nova.network.neutron [req-4ec59444-0270-4c0d-9b4d-756930cd9d60 req-f569a888-c153-49a5-9403-f689c294ac78 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:09:46 compute-0 nova_compute[192567]: 2025-10-02 08:09:46.908 2 DEBUG oslo_concurrency.lockutils [req-4ec59444-0270-4c0d-9b4d-756930cd9d60 req-f569a888-c153-49a5-9403-f689c294ac78 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:09:46 compute-0 nova_compute[192567]: 2025-10-02 08:09:46.910 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquired lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:09:46 compute-0 nova_compute[192567]: 2025-10-02 08:09:46.910 2 DEBUG nova.network.neutron [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:09:47 compute-0 nova_compute[192567]: 2025-10-02 08:09:47.071 2 DEBUG nova.network.neutron [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.050 2 DEBUG nova.network.neutron [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Updating instance_info_cache with network_info: [{"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.070 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Releasing lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.071 2 DEBUG nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Instance network_info: |[{"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.078 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Start _get_guest_xml network_info=[{"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.086 2 WARNING nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.091 2 DEBUG nova.virt.libvirt.host [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.092 2 DEBUG nova.virt.libvirt.host [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.096 2 DEBUG nova.virt.libvirt.host [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.096 2 DEBUG nova.virt.libvirt.host [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.097 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.098 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.099 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.099 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.099 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.100 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.100 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.101 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.101 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.102 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.102 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.102 2 DEBUG nova.virt.hardware [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.109 2 DEBUG nova.privsep.utils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.111 2 DEBUG nova.virt.libvirt.vif [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:09:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1330242329',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1330242329',id=2,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-ftj0v7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:09:41Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=2e661e5f-2462-4ffd-99a7-afc83d45f425,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.111 2 DEBUG nova.network.os_vif_util [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.113 2 DEBUG nova.network.os_vif_util [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f4:5e,bridge_name='br-int',has_traffic_filtering=True,id=782354d7-2469-4521-9850-4777d41a0047,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap782354d7-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.116 2 DEBUG nova.objects.instance [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2e661e5f-2462-4ffd-99a7-afc83d45f425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.129 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <uuid>2e661e5f-2462-4ffd-99a7-afc83d45f425</uuid>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <name>instance-00000002</name>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1330242329</nova:name>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:09:48</nova:creationTime>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:09:48 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:09:48 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:09:48 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:09:48 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:09:48 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:09:48 compute-0 nova_compute[192567]:         <nova:user uuid="4b5c71b386a34e829eef47bf613d813c">tempest-TestExecuteActionsViaActuator-547955480-project-admin</nova:user>
Oct 02 08:09:48 compute-0 nova_compute[192567]:         <nova:project uuid="a5d6400b4e3f4d98a7456330f6429bd5">tempest-TestExecuteActionsViaActuator-547955480</nova:project>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:09:48 compute-0 nova_compute[192567]:         <nova:port uuid="782354d7-2469-4521-9850-4777d41a0047">
Oct 02 08:09:48 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <system>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <entry name="serial">2e661e5f-2462-4ffd-99a7-afc83d45f425</entry>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <entry name="uuid">2e661e5f-2462-4ffd-99a7-afc83d45f425</entry>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     </system>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <os>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   </os>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <features>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   </features>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk.config"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:6c:f4:5e"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <target dev="tap782354d7-24"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/console.log" append="off"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <video>
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     </video>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:09:48 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:09:48 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:09:48 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:09:48 compute-0 nova_compute[192567]: </domain>
Oct 02 08:09:48 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.131 2 DEBUG nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Preparing to wait for external event network-vif-plugged-782354d7-2469-4521-9850-4777d41a0047 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.132 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.132 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.132 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.133 2 DEBUG nova.virt.libvirt.vif [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:09:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1330242329',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1330242329',id=2,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-ftj0v7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:09:41Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=2e661e5f-2462-4ffd-99a7-afc83d45f425,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.133 2 DEBUG nova.network.os_vif_util [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.134 2 DEBUG nova.network.os_vif_util [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f4:5e,bridge_name='br-int',has_traffic_filtering=True,id=782354d7-2469-4521-9850-4777d41a0047,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap782354d7-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.134 2 DEBUG os_vif [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f4:5e,bridge_name='br-int',has_traffic_filtering=True,id=782354d7-2469-4521-9850-4777d41a0047,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap782354d7-24') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.237 2 DEBUG ovsdbapp.backend.ovs_idl [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.238 2 DEBUG ovsdbapp.backend.ovs_idl [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.238 2 DEBUG ovsdbapp.backend.ovs_idl [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.266 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.267 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.268 2 INFO oslo.privsep.daemon [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpx97p0rmq/privsep.sock']
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.984 2 INFO oslo.privsep.daemon [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Spawned new privsep daemon via rootwrap
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.869 78 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.877 78 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.881 78 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Oct 02 08:09:48 compute-0 nova_compute[192567]: 2025-10-02 08:09:48.882 78 INFO oslo.privsep.daemon [-] privsep daemon running as pid 78
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.301 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap782354d7-24, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.302 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap782354d7-24, col_values=(('external_ids', {'iface-id': '782354d7-2469-4521-9850-4777d41a0047', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:f4:5e', 'vm-uuid': '2e661e5f-2462-4ffd-99a7-afc83d45f425'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:09:49 compute-0 NetworkManager[51654]: <info>  [1759392589.3493] manager: (tap782354d7-24): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.360 2 INFO os_vif [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:f4:5e,bridge_name='br-int',has_traffic_filtering=True,id=782354d7-2469-4521-9850-4777d41a0047,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap782354d7-24')
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.497 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.497 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.498 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] No VIF found with MAC fa:16:3e:6c:f4:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.498 2 INFO nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Using config drive
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.845 2 INFO nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Creating config drive at /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk.config
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.851 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8c2q9nt_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:49 compute-0 nova_compute[192567]: 2025-10-02 08:09:49.976 2 DEBUG oslo_concurrency.processutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8c2q9nt_" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:09:50 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct 02 08:09:50 compute-0 kernel: tap782354d7-24: entered promiscuous mode
Oct 02 08:09:50 compute-0 nova_compute[192567]: 2025-10-02 08:09:50.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:50 compute-0 ovn_controller[94821]: 2025-10-02T08:09:50Z|00027|binding|INFO|Claiming lport 782354d7-2469-4521-9850-4777d41a0047 for this chassis.
Oct 02 08:09:50 compute-0 ovn_controller[94821]: 2025-10-02T08:09:50Z|00028|binding|INFO|782354d7-2469-4521-9850-4777d41a0047: Claiming fa:16:3e:6c:f4:5e 10.100.0.8
Oct 02 08:09:50 compute-0 NetworkManager[51654]: <info>  [1759392590.0995] manager: (tap782354d7-24): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Oct 02 08:09:50 compute-0 nova_compute[192567]: 2025-10-02 08:09:50.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.122 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:f4:5e 10.100.0.8'], port_security=['fa:16:3e:6c:f4:5e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '2e661e5f-2462-4ffd-99a7-afc83d45f425', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d6400b4e3f4d98a7456330f6429bd5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cba96dbc-c401-4d81-b355-4680d6ad5e15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d06b6f4b-ccde-4903-a1fe-e6bac9f52057, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=782354d7-2469-4521-9850-4777d41a0047) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.125 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 782354d7-2469-4521-9850-4777d41a0047 in datapath 441198e3-04ff-48aa-b8a7-2339e4bb8085 bound to our chassis
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.128 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.130 103703 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpn924r8e3/privsep.sock']
Oct 02 08:09:50 compute-0 systemd-udevd[215171]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:09:50 compute-0 NetworkManager[51654]: <info>  [1759392590.1859] device (tap782354d7-24): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:09:50 compute-0 NetworkManager[51654]: <info>  [1759392590.1868] device (tap782354d7-24): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:09:50 compute-0 systemd-machined[152597]: New machine qemu-1-instance-00000002.
Oct 02 08:09:50 compute-0 nova_compute[192567]: 2025-10-02 08:09:50.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:50 compute-0 ovn_controller[94821]: 2025-10-02T08:09:50Z|00029|binding|INFO|Setting lport 782354d7-2469-4521-9850-4777d41a0047 ovn-installed in OVS
Oct 02 08:09:50 compute-0 ovn_controller[94821]: 2025-10-02T08:09:50Z|00030|binding|INFO|Setting lport 782354d7-2469-4521-9850-4777d41a0047 up in Southbound
Oct 02 08:09:50 compute-0 nova_compute[192567]: 2025-10-02 08:09:50.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:50 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Oct 02 08:09:50 compute-0 nova_compute[192567]: 2025-10-02 08:09:50.453 2 DEBUG nova.compute.manager [req-84dbfc1a-0c9a-43e1-9558-aec2b0ebb8ca req-20086b94-0b50-42b9-a150-2cf19b7b6a1b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Received event network-vif-plugged-782354d7-2469-4521-9850-4777d41a0047 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:09:50 compute-0 nova_compute[192567]: 2025-10-02 08:09:50.455 2 DEBUG oslo_concurrency.lockutils [req-84dbfc1a-0c9a-43e1-9558-aec2b0ebb8ca req-20086b94-0b50-42b9-a150-2cf19b7b6a1b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:50 compute-0 nova_compute[192567]: 2025-10-02 08:09:50.457 2 DEBUG oslo_concurrency.lockutils [req-84dbfc1a-0c9a-43e1-9558-aec2b0ebb8ca req-20086b94-0b50-42b9-a150-2cf19b7b6a1b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:50 compute-0 nova_compute[192567]: 2025-10-02 08:09:50.457 2 DEBUG oslo_concurrency.lockutils [req-84dbfc1a-0c9a-43e1-9558-aec2b0ebb8ca req-20086b94-0b50-42b9-a150-2cf19b7b6a1b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:50 compute-0 nova_compute[192567]: 2025-10-02 08:09:50.458 2 DEBUG nova.compute.manager [req-84dbfc1a-0c9a-43e1-9558-aec2b0ebb8ca req-20086b94-0b50-42b9-a150-2cf19b7b6a1b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Processing event network-vif-plugged-782354d7-2469-4521-9850-4777d41a0047 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.656 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:09:50 compute-0 nova_compute[192567]: 2025-10-02 08:09:50.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.872 103703 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.872 103703 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpn924r8e3/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.738 215188 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.744 215188 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.746 215188 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.747 215188 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215188
Oct 02 08:09:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:50.876 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b136d91e-d803-4174-98a1-dd9254f4a1c7]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:51.471 215188 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:51.471 215188 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:51.471 215188 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.729 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392591.7288258, 2e661e5f-2462-4ffd-99a7-afc83d45f425 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.730 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] VM Started (Lifecycle Event)
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.734 2 DEBUG nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.753 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.759 2 INFO nova.virt.libvirt.driver [-] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Instance spawned successfully.
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.760 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.769 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.776 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.788 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.789 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.790 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.791 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.792 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.793 2 DEBUG nova.virt.libvirt.driver [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.804 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.805 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392591.7291992, 2e661e5f-2462-4ffd-99a7-afc83d45f425 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.805 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] VM Paused (Lifecycle Event)
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.830 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.834 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392591.7536352, 2e661e5f-2462-4ffd-99a7-afc83d45f425 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.834 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] VM Resumed (Lifecycle Event)
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.869 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.873 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.890 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.898 2 INFO nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Took 10.33 seconds to spawn the instance on the hypervisor.
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.899 2 DEBUG nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.946 2 INFO nova.compute.manager [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Took 10.93 seconds to build instance.
Oct 02 08:09:51 compute-0 nova_compute[192567]: 2025-10-02 08:09:51.974 2 DEBUG oslo_concurrency.lockutils [None req-527c0856-e9b1-4a66-b48d-6837e01a3ed4 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.063 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[50b4157a-0af9-485e-9fd3-fb92d8e8410b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.064 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap441198e3-01 in ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.067 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap441198e3-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.067 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[68148259-7ccc-4b96-b3f1-83c5f04eb99c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.070 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[02f8d41c-842a-4283-a937-f7e9425993ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.110 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[1b14a37f-3fee-49bc-a4df-441ebbba315c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.141 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[2640e72e-ecc4-4643-9e9f-089213fcd017]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.144 103703 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmplagx8p3i/privsep.sock']
Oct 02 08:09:52 compute-0 nova_compute[192567]: 2025-10-02 08:09:52.573 2 DEBUG nova.compute.manager [req-7cb3c930-43ef-4682-ba3b-7f064661c0b1 req-e4375407-df85-4e0d-82bf-9b3138754a1d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Received event network-vif-plugged-782354d7-2469-4521-9850-4777d41a0047 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:09:52 compute-0 nova_compute[192567]: 2025-10-02 08:09:52.574 2 DEBUG oslo_concurrency.lockutils [req-7cb3c930-43ef-4682-ba3b-7f064661c0b1 req-e4375407-df85-4e0d-82bf-9b3138754a1d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:52 compute-0 nova_compute[192567]: 2025-10-02 08:09:52.574 2 DEBUG oslo_concurrency.lockutils [req-7cb3c930-43ef-4682-ba3b-7f064661c0b1 req-e4375407-df85-4e0d-82bf-9b3138754a1d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:52 compute-0 nova_compute[192567]: 2025-10-02 08:09:52.575 2 DEBUG oslo_concurrency.lockutils [req-7cb3c930-43ef-4682-ba3b-7f064661c0b1 req-e4375407-df85-4e0d-82bf-9b3138754a1d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:52 compute-0 nova_compute[192567]: 2025-10-02 08:09:52.575 2 DEBUG nova.compute.manager [req-7cb3c930-43ef-4682-ba3b-7f064661c0b1 req-e4375407-df85-4e0d-82bf-9b3138754a1d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] No waiting events found dispatching network-vif-plugged-782354d7-2469-4521-9850-4777d41a0047 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:09:52 compute-0 nova_compute[192567]: 2025-10-02 08:09:52.576 2 WARNING nova.compute.manager [req-7cb3c930-43ef-4682-ba3b-7f064661c0b1 req-e4375407-df85-4e0d-82bf-9b3138754a1d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Received unexpected event network-vif-plugged-782354d7-2469-4521-9850-4777d41a0047 for instance with vm_state active and task_state None.
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.913 103703 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.915 103703 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmplagx8p3i/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.770 215209 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.779 215209 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.783 215209 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.783 215209 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215209
Oct 02 08:09:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:52.919 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b2bf9e-0b5f-4c2e-9cf1-2ef5715fb67a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:53.409 215209 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:09:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:53.409 215209 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:09:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:53.409 215209 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:09:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:53.953 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb1d613-4a35-4b88-9f2a-2a1150958207]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:53.965 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[d03d00e2-5507-45e5-83c6-ccc998dffc03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:53 compute-0 NetworkManager[51654]: <info>  [1759392593.9704] manager: (tap441198e3-00): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Oct 02 08:09:54 compute-0 systemd-udevd[215219]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.011 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[7d24ae63-cf5e-4ead-aa3a-5376c0a72eaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.014 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[7b0a8529-9c29-463d-b9dc-038ecb038368]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:54 compute-0 NetworkManager[51654]: <info>  [1759392594.0547] device (tap441198e3-00): carrier: link connected
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.060 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[c06d50f6-9b0f-4142-9cfe-21e219160e77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.089 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f2ca36ad-2962-4c8d-870c-8accd5469503]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap441198e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:13:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348164, 'reachable_time': 42571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215238, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.112 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[88357d4b-ed97-4132-b811-5fcbeee84261]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9d:13ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348164, 'tstamp': 348164}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215239, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.138 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[700aea05-1cc2-421a-8e2e-b015da43c33b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap441198e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:13:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348164, 'reachable_time': 42571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215240, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.190 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[71d6c0e2-d7a9-4f9c-8142-b543726b8b37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.247 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[650eb88e-1184-4af9-a329-db8a842cc963]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.250 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441198e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.251 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.252 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap441198e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:09:54 compute-0 nova_compute[192567]: 2025-10-02 08:09:54.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:54 compute-0 kernel: tap441198e3-00: entered promiscuous mode
Oct 02 08:09:54 compute-0 NetworkManager[51654]: <info>  [1759392594.2566] manager: (tap441198e3-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Oct 02 08:09:54 compute-0 nova_compute[192567]: 2025-10-02 08:09:54.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.259 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap441198e3-00, col_values=(('external_ids', {'iface-id': 'f4e4745f-6cb7-4dfe-930a-ab5c5f2db11b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:09:54 compute-0 nova_compute[192567]: 2025-10-02 08:09:54.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:54 compute-0 ovn_controller[94821]: 2025-10-02T08:09:54Z|00031|binding|INFO|Releasing lport f4e4745f-6cb7-4dfe-930a-ab5c5f2db11b from this chassis (sb_readonly=0)
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.264 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/441198e3-04ff-48aa-b8a7-2339e4bb8085.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/441198e3-04ff-48aa-b8a7-2339e4bb8085.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.266 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[28120d1e-fdcb-45ef-ad7b-a3ab9d1941d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.268 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/441198e3-04ff-48aa-b8a7-2339e4bb8085.pid.haproxy
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.270 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'env', 'PROCESS_TAG=haproxy-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/441198e3-04ff-48aa-b8a7-2339e4bb8085.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:09:54 compute-0 nova_compute[192567]: 2025-10-02 08:09:54.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:54 compute-0 nova_compute[192567]: 2025-10-02 08:09:54.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:54 compute-0 podman[215273]: 2025-10-02 08:09:54.706860124 +0000 UTC m=+0.070375498 container create 6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct 02 08:09:54 compute-0 podman[215273]: 2025-10-02 08:09:54.670664935 +0000 UTC m=+0.034180329 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:09:54 compute-0 systemd[1]: Started libpod-conmon-6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb.scope.
Oct 02 08:09:54 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:09:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d48ba9d0cf4319b7524e50130d2054a00142945398b8772533cd818497f0464/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:09:54 compute-0 podman[215273]: 2025-10-02 08:09:54.829018798 +0000 UTC m=+0.192534162 container init 6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 02 08:09:54 compute-0 podman[215273]: 2025-10-02 08:09:54.839732403 +0000 UTC m=+0.203247757 container start 6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 02 08:09:54 compute-0 neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085[215288]: [NOTICE]   (215292) : New worker (215294) forked
Oct 02 08:09:54 compute-0 neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085[215288]: [NOTICE]   (215292) : Loading success.
Oct 02 08:09:54 compute-0 nova_compute[192567]: 2025-10-02 08:09:54.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:54.933 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:09:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:09:56.939 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:09:58 compute-0 podman[215303]: 2025-10-02 08:09:58.246036552 +0000 UTC m=+0.148351673 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, managed_by=edpm_ansible, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': 
'/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=edpm, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 02 08:09:59 compute-0 nova_compute[192567]: 2025-10-02 08:09:59.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:09:59 compute-0 podman[203011]: time="2025-10-02T08:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:09:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:09:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3441 "" "Go-http-client/1.1"
Oct 02 08:09:59 compute-0 nova_compute[192567]: 2025-10-02 08:09:59.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:01 compute-0 openstack_network_exporter[205118]: ERROR   08:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:10:01 compute-0 openstack_network_exporter[205118]: ERROR   08:10:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:10:01 compute-0 openstack_network_exporter[205118]: ERROR   08:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:10:01 compute-0 openstack_network_exporter[205118]: ERROR   08:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:10:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:10:01 compute-0 openstack_network_exporter[205118]: ERROR   08:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:10:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:10:02 compute-0 ovn_controller[94821]: 2025-10-02T08:10:02Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:f4:5e 10.100.0.8
Oct 02 08:10:02 compute-0 ovn_controller[94821]: 2025-10-02T08:10:02Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:f4:5e 10.100.0.8
Oct 02 08:10:04 compute-0 nova_compute[192567]: 2025-10-02 08:10:04.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:04 compute-0 nova_compute[192567]: 2025-10-02 08:10:04.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:07 compute-0 podman[215339]: 2025-10-02 08:10:07.209027053 +0000 UTC m=+0.109302933 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001)
Oct 02 08:10:07 compute-0 podman[215341]: 2025-10-02 08:10:07.215346951 +0000 UTC m=+0.101805969 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true)
Oct 02 08:10:07 compute-0 podman[215340]: 2025-10-02 08:10:07.295557015 +0000 UTC m=+0.196000570 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct 02 08:10:09 compute-0 podman[215400]: 2025-10-02 08:10:09.193994666 +0000 UTC m=+0.094923125 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001)
Oct 02 08:10:09 compute-0 nova_compute[192567]: 2025-10-02 08:10:09.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:09 compute-0 nova_compute[192567]: 2025-10-02 08:10:09.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:14 compute-0 podman[215420]: 2025-10-02 08:10:14.181816601 +0000 UTC m=+0.088150944 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:10:14 compute-0 nova_compute[192567]: 2025-10-02 08:10:14.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:14 compute-0 nova_compute[192567]: 2025-10-02 08:10:14.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:18 compute-0 nova_compute[192567]: 2025-10-02 08:10:18.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:10:18 compute-0 nova_compute[192567]: 2025-10-02 08:10:18.670 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:18 compute-0 nova_compute[192567]: 2025-10-02 08:10:18.672 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:18 compute-0 nova_compute[192567]: 2025-10-02 08:10:18.672 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:18 compute-0 nova_compute[192567]: 2025-10-02 08:10:18.673 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:10:18 compute-0 nova_compute[192567]: 2025-10-02 08:10:18.887 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:10:18 compute-0 nova_compute[192567]: 2025-10-02 08:10:18.967 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:10:18 compute-0 nova_compute[192567]: 2025-10-02 08:10:18.968 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.054 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.265 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.267 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5730MB free_disk=73.44052505493164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.267 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.268 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.354 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 2e661e5f-2462-4ffd-99a7-afc83d45f425 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.355 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.355 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.405 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.436 2 ERROR nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [req-3d0bad9d-f3df-4350-9bce-f6cda2a6b810] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID e7f6698e-de2d-4705-8493-a3445ce0cf6e.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-3d0bad9d-f3df-4350-9bce-f6cda2a6b810"}]}
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.454 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.474 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.475 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.491 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.515 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.556 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.617 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updated inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.617 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.618 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.648 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.649 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:19 compute-0 nova_compute[192567]: 2025-10-02 08:10:19.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:21 compute-0 nova_compute[192567]: 2025-10-02 08:10:21.643 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:10:21 compute-0 nova_compute[192567]: 2025-10-02 08:10:21.644 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:10:21 compute-0 nova_compute[192567]: 2025-10-02 08:10:21.644 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:10:22 compute-0 nova_compute[192567]: 2025-10-02 08:10:22.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:10:22 compute-0 nova_compute[192567]: 2025-10-02 08:10:22.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:10:22 compute-0 nova_compute[192567]: 2025-10-02 08:10:22.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:10:22 compute-0 nova_compute[192567]: 2025-10-02 08:10:22.802 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:10:22 compute-0 nova_compute[192567]: 2025-10-02 08:10:22.803 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:10:22 compute-0 nova_compute[192567]: 2025-10-02 08:10:22.803 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:10:22 compute-0 nova_compute[192567]: 2025-10-02 08:10:22.803 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2e661e5f-2462-4ffd-99a7-afc83d45f425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:10:23 compute-0 nova_compute[192567]: 2025-10-02 08:10:23.872 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Updating instance_info_cache with network_info: [{"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:10:23 compute-0 nova_compute[192567]: 2025-10-02 08:10:23.891 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:10:23 compute-0 nova_compute[192567]: 2025-10-02 08:10:23.891 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:10:23 compute-0 nova_compute[192567]: 2025-10-02 08:10:23.892 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:10:24 compute-0 ovn_controller[94821]: 2025-10-02T08:10:24Z|00032|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct 02 08:10:24 compute-0 nova_compute[192567]: 2025-10-02 08:10:24.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:24 compute-0 nova_compute[192567]: 2025-10-02 08:10:24.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:10:24 compute-0 nova_compute[192567]: 2025-10-02 08:10:24.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:10:24 compute-0 nova_compute[192567]: 2025-10-02 08:10:24.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:26 compute-0 nova_compute[192567]: 2025-10-02 08:10:26.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:10:27 compute-0 nova_compute[192567]: 2025-10-02 08:10:27.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:10:29 compute-0 podman[215452]: 2025-10-02 08:10:29.190323494 +0000 UTC m=+0.088620329 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Oct 02 08:10:29 compute-0 nova_compute[192567]: 2025-10-02 08:10:29.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:29 compute-0 podman[203011]: time="2025-10-02T08:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:10:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:10:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3455 "" "Go-http-client/1.1"
Oct 02 08:10:29 compute-0 nova_compute[192567]: 2025-10-02 08:10:29.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:31 compute-0 openstack_network_exporter[205118]: ERROR   08:10:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:10:31 compute-0 openstack_network_exporter[205118]: ERROR   08:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:10:31 compute-0 openstack_network_exporter[205118]: ERROR   08:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:10:31 compute-0 openstack_network_exporter[205118]: ERROR   08:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:10:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:10:31 compute-0 openstack_network_exporter[205118]: ERROR   08:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:10:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:10:31 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:31.940 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:10:31 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:31.941 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:10:31 compute-0 nova_compute[192567]: 2025-10-02 08:10:31.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:34 compute-0 nova_compute[192567]: 2025-10-02 08:10:34.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:34 compute-0 nova_compute[192567]: 2025-10-02 08:10:34.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:38 compute-0 podman[215473]: 2025-10-02 08:10:38.182353614 +0000 UTC m=+0.085319425 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct 02 08:10:38 compute-0 podman[215478]: 2025-10-02 08:10:38.22482629 +0000 UTC m=+0.109942274 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 02 08:10:38 compute-0 podman[215474]: 2025-10-02 08:10:38.258383487 +0000 UTC m=+0.139598008 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:10:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:38.944 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:10:39 compute-0 nova_compute[192567]: 2025-10-02 08:10:39.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:39 compute-0 nova_compute[192567]: 2025-10-02 08:10:39.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.097 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "f13a8d11-bf67-4548-81bb-3bfd210a0471" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.097 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.121 2 DEBUG nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:10:40 compute-0 podman[215537]: 2025-10-02 08:10:40.202407443 +0000 UTC m=+0.112019229 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.226 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.227 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.233 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.233 2 INFO nova.compute.claims [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.411 2 DEBUG nova.compute.provider_tree [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.428 2 DEBUG nova.scheduler.client.report [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.457 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.458 2 DEBUG nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.499 2 DEBUG nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.500 2 DEBUG nova.network.neutron [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.518 2 INFO nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.539 2 DEBUG nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.643 2 DEBUG nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.645 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.646 2 INFO nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Creating image(s)
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.647 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "/var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.647 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "/var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.649 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "/var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.676 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.740 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.741 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.742 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.752 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.818 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.819 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.873 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.875 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.875 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.957 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.958 2 DEBUG nova.virt.disk.api [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Checking if we can resize image /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:10:40 compute-0 nova_compute[192567]: 2025-10-02 08:10:40.958 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:10:41 compute-0 nova_compute[192567]: 2025-10-02 08:10:41.053 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:10:41 compute-0 nova_compute[192567]: 2025-10-02 08:10:41.055 2 DEBUG nova.virt.disk.api [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Cannot resize image /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:10:41 compute-0 nova_compute[192567]: 2025-10-02 08:10:41.055 2 DEBUG nova.objects.instance [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'migration_context' on Instance uuid f13a8d11-bf67-4548-81bb-3bfd210a0471 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:10:41 compute-0 nova_compute[192567]: 2025-10-02 08:10:41.070 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:10:41 compute-0 nova_compute[192567]: 2025-10-02 08:10:41.071 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Ensure instance console log exists: /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:10:41 compute-0 nova_compute[192567]: 2025-10-02 08:10:41.072 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:41 compute-0 nova_compute[192567]: 2025-10-02 08:10:41.072 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:41 compute-0 nova_compute[192567]: 2025-10-02 08:10:41.072 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:43 compute-0 nova_compute[192567]: 2025-10-02 08:10:43.806 2 DEBUG nova.network.neutron [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Successfully created port: ea9a363a-d800-41d1-b7a3-819f91395719 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:10:44 compute-0 nova_compute[192567]: 2025-10-02 08:10:44.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:44 compute-0 nova_compute[192567]: 2025-10-02 08:10:44.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:45 compute-0 podman[215572]: 2025-10-02 08:10:45.192011334 +0000 UTC m=+0.092701391 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:10:45 compute-0 nova_compute[192567]: 2025-10-02 08:10:45.908 2 DEBUG nova.network.neutron [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Successfully updated port: ea9a363a-d800-41d1-b7a3-819f91395719 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:10:45 compute-0 nova_compute[192567]: 2025-10-02 08:10:45.933 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "refresh_cache-f13a8d11-bf67-4548-81bb-3bfd210a0471" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:10:45 compute-0 nova_compute[192567]: 2025-10-02 08:10:45.933 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquired lock "refresh_cache-f13a8d11-bf67-4548-81bb-3bfd210a0471" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:10:45 compute-0 nova_compute[192567]: 2025-10-02 08:10:45.933 2 DEBUG nova.network.neutron [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:10:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:45.967 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:45.968 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:45.969 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:46 compute-0 nova_compute[192567]: 2025-10-02 08:10:46.024 2 DEBUG nova.compute.manager [req-b654c5f5-4819-4ade-938c-7753e97871ea req-752a0b14-560c-4f3f-970d-69831fb15a9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Received event network-changed-ea9a363a-d800-41d1-b7a3-819f91395719 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:10:46 compute-0 nova_compute[192567]: 2025-10-02 08:10:46.025 2 DEBUG nova.compute.manager [req-b654c5f5-4819-4ade-938c-7753e97871ea req-752a0b14-560c-4f3f-970d-69831fb15a9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Refreshing instance network info cache due to event network-changed-ea9a363a-d800-41d1-b7a3-819f91395719. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:10:46 compute-0 nova_compute[192567]: 2025-10-02 08:10:46.025 2 DEBUG oslo_concurrency.lockutils [req-b654c5f5-4819-4ade-938c-7753e97871ea req-752a0b14-560c-4f3f-970d-69831fb15a9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-f13a8d11-bf67-4548-81bb-3bfd210a0471" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:10:46 compute-0 nova_compute[192567]: 2025-10-02 08:10:46.703 2 DEBUG nova.network.neutron [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.001 2 DEBUG nova.network.neutron [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Updating instance_info_cache with network_info: [{"id": "ea9a363a-d800-41d1-b7a3-819f91395719", "address": "fa:16:3e:7c:27:70", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea9a363a-d8", "ovs_interfaceid": "ea9a363a-d800-41d1-b7a3-819f91395719", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.031 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Releasing lock "refresh_cache-f13a8d11-bf67-4548-81bb-3bfd210a0471" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.031 2 DEBUG nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Instance network_info: |[{"id": "ea9a363a-d800-41d1-b7a3-819f91395719", "address": "fa:16:3e:7c:27:70", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea9a363a-d8", "ovs_interfaceid": "ea9a363a-d800-41d1-b7a3-819f91395719", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.031 2 DEBUG oslo_concurrency.lockutils [req-b654c5f5-4819-4ade-938c-7753e97871ea req-752a0b14-560c-4f3f-970d-69831fb15a9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-f13a8d11-bf67-4548-81bb-3bfd210a0471" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.031 2 DEBUG nova.network.neutron [req-b654c5f5-4819-4ade-938c-7753e97871ea req-752a0b14-560c-4f3f-970d-69831fb15a9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Refreshing network info cache for port ea9a363a-d800-41d1-b7a3-819f91395719 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.035 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Start _get_guest_xml network_info=[{"id": "ea9a363a-d800-41d1-b7a3-819f91395719", "address": "fa:16:3e:7c:27:70", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea9a363a-d8", "ovs_interfaceid": "ea9a363a-d800-41d1-b7a3-819f91395719", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.041 2 WARNING nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.052 2 DEBUG nova.virt.libvirt.host [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.052 2 DEBUG nova.virt.libvirt.host [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.056 2 DEBUG nova.virt.libvirt.host [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.057 2 DEBUG nova.virt.libvirt.host [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.057 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.058 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.058 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.058 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.058 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.059 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.059 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.059 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.059 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.059 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.060 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.060 2 DEBUG nova.virt.hardware [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.063 2 DEBUG nova.virt.libvirt.vif [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:10:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-922374601',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-922374601',id=4,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-itjzem6y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:10:40Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=f13a8d11-bf67-4548-81bb-3bfd210a0471,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea9a363a-d800-41d1-b7a3-819f91395719", "address": "fa:16:3e:7c:27:70", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea9a363a-d8", "ovs_interfaceid": "ea9a363a-d800-41d1-b7a3-819f91395719", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.063 2 DEBUG nova.network.os_vif_util [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "ea9a363a-d800-41d1-b7a3-819f91395719", "address": "fa:16:3e:7c:27:70", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea9a363a-d8", "ovs_interfaceid": "ea9a363a-d800-41d1-b7a3-819f91395719", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.064 2 DEBUG nova.network.os_vif_util [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:27:70,bridge_name='br-int',has_traffic_filtering=True,id=ea9a363a-d800-41d1-b7a3-819f91395719,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea9a363a-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.064 2 DEBUG nova.objects.instance [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'pci_devices' on Instance uuid f13a8d11-bf67-4548-81bb-3bfd210a0471 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.081 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <uuid>f13a8d11-bf67-4548-81bb-3bfd210a0471</uuid>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <name>instance-00000004</name>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-922374601</nova:name>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:10:48</nova:creationTime>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:10:48 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:10:48 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:10:48 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:10:48 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:10:48 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:10:48 compute-0 nova_compute[192567]:         <nova:user uuid="4b5c71b386a34e829eef47bf613d813c">tempest-TestExecuteActionsViaActuator-547955480-project-admin</nova:user>
Oct 02 08:10:48 compute-0 nova_compute[192567]:         <nova:project uuid="a5d6400b4e3f4d98a7456330f6429bd5">tempest-TestExecuteActionsViaActuator-547955480</nova:project>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:10:48 compute-0 nova_compute[192567]:         <nova:port uuid="ea9a363a-d800-41d1-b7a3-819f91395719">
Oct 02 08:10:48 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <system>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <entry name="serial">f13a8d11-bf67-4548-81bb-3bfd210a0471</entry>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <entry name="uuid">f13a8d11-bf67-4548-81bb-3bfd210a0471</entry>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     </system>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <os>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   </os>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <features>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   </features>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk.config"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:7c:27:70"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <target dev="tapea9a363a-d8"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/console.log" append="off"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <video>
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     </video>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:10:48 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:10:48 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:10:48 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:10:48 compute-0 nova_compute[192567]: </domain>
Oct 02 08:10:48 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.082 2 DEBUG nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Preparing to wait for external event network-vif-plugged-ea9a363a-d800-41d1-b7a3-819f91395719 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.082 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.082 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.082 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.083 2 DEBUG nova.virt.libvirt.vif [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:10:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-922374601',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-922374601',id=4,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-itjzem6y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:10:40Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=f13a8d11-bf67-4548-81bb-3bfd210a0471,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea9a363a-d800-41d1-b7a3-819f91395719", "address": "fa:16:3e:7c:27:70", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea9a363a-d8", "ovs_interfaceid": "ea9a363a-d800-41d1-b7a3-819f91395719", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.083 2 DEBUG nova.network.os_vif_util [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "ea9a363a-d800-41d1-b7a3-819f91395719", "address": "fa:16:3e:7c:27:70", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea9a363a-d8", "ovs_interfaceid": "ea9a363a-d800-41d1-b7a3-819f91395719", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.084 2 DEBUG nova.network.os_vif_util [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:27:70,bridge_name='br-int',has_traffic_filtering=True,id=ea9a363a-d800-41d1-b7a3-819f91395719,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea9a363a-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.084 2 DEBUG os_vif [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:27:70,bridge_name='br-int',has_traffic_filtering=True,id=ea9a363a-d800-41d1-b7a3-819f91395719,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea9a363a-d8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.085 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.085 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.089 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea9a363a-d8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.089 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea9a363a-d8, col_values=(('external_ids', {'iface-id': 'ea9a363a-d800-41d1-b7a3-819f91395719', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:27:70', 'vm-uuid': 'f13a8d11-bf67-4548-81bb-3bfd210a0471'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:10:48 compute-0 NetworkManager[51654]: <info>  [1759392648.0922] manager: (tapea9a363a-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.104 2 INFO os_vif [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:27:70,bridge_name='br-int',has_traffic_filtering=True,id=ea9a363a-d800-41d1-b7a3-819f91395719,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea9a363a-d8')
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.163 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.163 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.164 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] No VIF found with MAC fa:16:3e:7c:27:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.165 2 INFO nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Using config drive
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.886 2 INFO nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Creating config drive at /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk.config
Oct 02 08:10:48 compute-0 nova_compute[192567]: 2025-10-02 08:10:48.895 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwx6uvsdz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.019 2 DEBUG oslo_concurrency.processutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwx6uvsdz" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:10:49 compute-0 kernel: tapea9a363a-d8: entered promiscuous mode
Oct 02 08:10:49 compute-0 NetworkManager[51654]: <info>  [1759392649.0991] manager: (tapea9a363a-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Oct 02 08:10:49 compute-0 ovn_controller[94821]: 2025-10-02T08:10:49Z|00033|binding|INFO|Claiming lport ea9a363a-d800-41d1-b7a3-819f91395719 for this chassis.
Oct 02 08:10:49 compute-0 ovn_controller[94821]: 2025-10-02T08:10:49Z|00034|binding|INFO|ea9a363a-d800-41d1-b7a3-819f91395719: Claiming fa:16:3e:7c:27:70 10.100.0.9
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.110 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:27:70 10.100.0.9'], port_security=['fa:16:3e:7c:27:70 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f13a8d11-bf67-4548-81bb-3bfd210a0471', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d6400b4e3f4d98a7456330f6429bd5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cba96dbc-c401-4d81-b355-4680d6ad5e15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d06b6f4b-ccde-4903-a1fe-e6bac9f52057, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=ea9a363a-d800-41d1-b7a3-819f91395719) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.113 103703 INFO neutron.agent.ovn.metadata.agent [-] Port ea9a363a-d800-41d1-b7a3-819f91395719 in datapath 441198e3-04ff-48aa-b8a7-2339e4bb8085 bound to our chassis
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.117 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:10:49 compute-0 ovn_controller[94821]: 2025-10-02T08:10:49Z|00035|binding|INFO|Setting lport ea9a363a-d800-41d1-b7a3-819f91395719 ovn-installed in OVS
Oct 02 08:10:49 compute-0 ovn_controller[94821]: 2025-10-02T08:10:49Z|00036|binding|INFO|Setting lport ea9a363a-d800-41d1-b7a3-819f91395719 up in Southbound
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.141 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[631a27db-ef50-45aa-b332-4dac63c86b26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:10:49 compute-0 systemd-machined[152597]: New machine qemu-2-instance-00000004.
Oct 02 08:10:49 compute-0 systemd-udevd[215618]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:10:49 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.180 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[39e4bae6-afe5-4f78-bdfe-7d90074e577f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:10:49 compute-0 NetworkManager[51654]: <info>  [1759392649.1870] device (tapea9a363a-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.187 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[b5e3b203-3a69-40cf-b0b5-1a6139e90fd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:10:49 compute-0 NetworkManager[51654]: <info>  [1759392649.1894] device (tapea9a363a-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.231 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[2deddd12-4716-4f95-96ee-30790045f7e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.251 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[2b16026b-76fc-493a-b322-b15a68edea46]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap441198e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:13:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348164, 'reachable_time': 44332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215626, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.275 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[6aca7dd3-b17f-4f48-942a-2713226a9de7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348180, 'tstamp': 348180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215629, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348183, 'tstamp': 348183}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215629, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.277 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441198e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.282 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap441198e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.283 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.284 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap441198e3-00, col_values=(('external_ids', {'iface-id': 'f4e4745f-6cb7-4dfe-930a-ab5c5f2db11b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:10:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:10:49.284 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.876 2 DEBUG nova.compute.manager [req-01bd990b-4d59-4e11-96c8-b7d27671dfab req-193d6fd3-75b1-4fd7-a5e0-e7d349d27a03 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Received event network-vif-plugged-ea9a363a-d800-41d1-b7a3-819f91395719 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.877 2 DEBUG oslo_concurrency.lockutils [req-01bd990b-4d59-4e11-96c8-b7d27671dfab req-193d6fd3-75b1-4fd7-a5e0-e7d349d27a03 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.877 2 DEBUG oslo_concurrency.lockutils [req-01bd990b-4d59-4e11-96c8-b7d27671dfab req-193d6fd3-75b1-4fd7-a5e0-e7d349d27a03 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.878 2 DEBUG oslo_concurrency.lockutils [req-01bd990b-4d59-4e11-96c8-b7d27671dfab req-193d6fd3-75b1-4fd7-a5e0-e7d349d27a03 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:49 compute-0 nova_compute[192567]: 2025-10-02 08:10:49.878 2 DEBUG nova.compute.manager [req-01bd990b-4d59-4e11-96c8-b7d27671dfab req-193d6fd3-75b1-4fd7-a5e0-e7d349d27a03 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Processing event network-vif-plugged-ea9a363a-d800-41d1-b7a3-819f91395719 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.238 2 DEBUG nova.network.neutron [req-b654c5f5-4819-4ade-938c-7753e97871ea req-752a0b14-560c-4f3f-970d-69831fb15a9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Updated VIF entry in instance network info cache for port ea9a363a-d800-41d1-b7a3-819f91395719. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.239 2 DEBUG nova.network.neutron [req-b654c5f5-4819-4ade-938c-7753e97871ea req-752a0b14-560c-4f3f-970d-69831fb15a9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Updating instance_info_cache with network_info: [{"id": "ea9a363a-d800-41d1-b7a3-819f91395719", "address": "fa:16:3e:7c:27:70", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea9a363a-d8", "ovs_interfaceid": "ea9a363a-d800-41d1-b7a3-819f91395719", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.256 2 DEBUG oslo_concurrency.lockutils [req-b654c5f5-4819-4ade-938c-7753e97871ea req-752a0b14-560c-4f3f-970d-69831fb15a9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-f13a8d11-bf67-4548-81bb-3bfd210a0471" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.344 2 DEBUG nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.345 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392650.3440619, f13a8d11-bf67-4548-81bb-3bfd210a0471 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.346 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] VM Started (Lifecycle Event)
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.353 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.358 2 INFO nova.virt.libvirt.driver [-] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Instance spawned successfully.
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.359 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.377 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.385 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.390 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.391 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.392 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.392 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.393 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.394 2 DEBUG nova.virt.libvirt.driver [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.439 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.440 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392650.3486886, f13a8d11-bf67-4548-81bb-3bfd210a0471 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.441 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] VM Paused (Lifecycle Event)
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.480 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.485 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392650.3506064, f13a8d11-bf67-4548-81bb-3bfd210a0471 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.486 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] VM Resumed (Lifecycle Event)
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.489 2 INFO nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Took 9.85 seconds to spawn the instance on the hypervisor.
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.490 2 DEBUG nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.504 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.510 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.540 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.568 2 INFO nova.compute.manager [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Took 10.38 seconds to build instance.
Oct 02 08:10:50 compute-0 nova_compute[192567]: 2025-10-02 08:10:50.588 2 DEBUG oslo_concurrency.lockutils [None req-a736244c-827c-455c-9404-942d4e79b041 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.491s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:52 compute-0 nova_compute[192567]: 2025-10-02 08:10:52.024 2 DEBUG nova.compute.manager [req-35865ddc-1297-45b2-b4b8-551f551b8b33 req-a0d496ff-b8e5-4e8f-9e63-ca63f8cf9660 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Received event network-vif-plugged-ea9a363a-d800-41d1-b7a3-819f91395719 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:10:52 compute-0 nova_compute[192567]: 2025-10-02 08:10:52.025 2 DEBUG oslo_concurrency.lockutils [req-35865ddc-1297-45b2-b4b8-551f551b8b33 req-a0d496ff-b8e5-4e8f-9e63-ca63f8cf9660 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:10:52 compute-0 nova_compute[192567]: 2025-10-02 08:10:52.025 2 DEBUG oslo_concurrency.lockutils [req-35865ddc-1297-45b2-b4b8-551f551b8b33 req-a0d496ff-b8e5-4e8f-9e63-ca63f8cf9660 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:10:52 compute-0 nova_compute[192567]: 2025-10-02 08:10:52.025 2 DEBUG oslo_concurrency.lockutils [req-35865ddc-1297-45b2-b4b8-551f551b8b33 req-a0d496ff-b8e5-4e8f-9e63-ca63f8cf9660 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:10:52 compute-0 nova_compute[192567]: 2025-10-02 08:10:52.026 2 DEBUG nova.compute.manager [req-35865ddc-1297-45b2-b4b8-551f551b8b33 req-a0d496ff-b8e5-4e8f-9e63-ca63f8cf9660 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] No waiting events found dispatching network-vif-plugged-ea9a363a-d800-41d1-b7a3-819f91395719 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:10:52 compute-0 nova_compute[192567]: 2025-10-02 08:10:52.027 2 WARNING nova.compute.manager [req-35865ddc-1297-45b2-b4b8-551f551b8b33 req-a0d496ff-b8e5-4e8f-9e63-ca63f8cf9660 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Received unexpected event network-vif-plugged-ea9a363a-d800-41d1-b7a3-819f91395719 for instance with vm_state active and task_state None.
Oct 02 08:10:53 compute-0 nova_compute[192567]: 2025-10-02 08:10:53.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:55 compute-0 nova_compute[192567]: 2025-10-02 08:10:55.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:58 compute-0 nova_compute[192567]: 2025-10-02 08:10:58.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:10:59 compute-0 podman[203011]: time="2025-10-02T08:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:10:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:10:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3458 "" "Go-http-client/1.1"
Oct 02 08:11:00 compute-0 nova_compute[192567]: 2025-10-02 08:11:00.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:00 compute-0 podman[215645]: 2025-10-02 08:11:00.230573537 +0000 UTC m=+0.128592371 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 02 08:11:01 compute-0 openstack_network_exporter[205118]: ERROR   08:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:11:01 compute-0 openstack_network_exporter[205118]: ERROR   08:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:11:01 compute-0 openstack_network_exporter[205118]: ERROR   08:11:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:11:01 compute-0 openstack_network_exporter[205118]: ERROR   08:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:11:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:11:01 compute-0 openstack_network_exporter[205118]: ERROR   08:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:11:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:11:02 compute-0 ovn_controller[94821]: 2025-10-02T08:11:02Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:27:70 10.100.0.9
Oct 02 08:11:02 compute-0 ovn_controller[94821]: 2025-10-02T08:11:02Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:27:70 10.100.0.9
Oct 02 08:11:03 compute-0 nova_compute[192567]: 2025-10-02 08:11:03.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:05 compute-0 nova_compute[192567]: 2025-10-02 08:11:05.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:08 compute-0 nova_compute[192567]: 2025-10-02 08:11:08.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:09 compute-0 podman[215677]: 2025-10-02 08:11:09.181026264 +0000 UTC m=+0.084981922 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 02 08:11:09 compute-0 podman[215679]: 2025-10-02 08:11:09.209546917 +0000 UTC m=+0.101093640 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=multipathd, container_name=multipathd)
Oct 02 08:11:09 compute-0 podman[215678]: 2025-10-02 08:11:09.221069424 +0000 UTC m=+0.125320161 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct 02 08:11:10 compute-0 nova_compute[192567]: 2025-10-02 08:11:10.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:11 compute-0 podman[215737]: 2025-10-02 08:11:11.163582058 +0000 UTC m=+0.084740905 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct 02 08:11:13 compute-0 nova_compute[192567]: 2025-10-02 08:11:13.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:15 compute-0 nova_compute[192567]: 2025-10-02 08:11:15.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:16 compute-0 podman[215757]: 2025-10-02 08:11:16.156701727 +0000 UTC m=+0.069825702 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:11:17 compute-0 nova_compute[192567]: 2025-10-02 08:11:17.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:17 compute-0 nova_compute[192567]: 2025-10-02 08:11:17.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:11:17 compute-0 nova_compute[192567]: 2025-10-02 08:11:17.641 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.649 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.685 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.687 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.688 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.688 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.789 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.858 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.859 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.918 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.926 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.993 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:18 compute-0 nova_compute[192567]: 2025-10-02 08:11:18.995 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.086 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.309 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.311 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5615MB free_disk=73.41180038452148GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.311 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.312 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.515 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 2e661e5f-2462-4ffd-99a7-afc83d45f425 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.516 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance f13a8d11-bf67-4548-81bb-3bfd210a0471 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.516 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.516 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.671 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.688 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.715 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:11:19 compute-0 nova_compute[192567]: 2025-10-02 08:11:19.716 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.404s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:20 compute-0 nova_compute[192567]: 2025-10-02 08:11:20.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.055 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "ed125c83-0f73-41e4-925c-db2354932843" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.056 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.080 2 DEBUG nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.176 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.177 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.188 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.188 2 INFO nova.compute.claims [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.397 2 DEBUG nova.compute.provider_tree [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.410 2 DEBUG nova.scheduler.client.report [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.428 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.428 2 DEBUG nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.463 2 DEBUG nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.463 2 DEBUG nova.network.neutron [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.482 2 INFO nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.497 2 DEBUG nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.607 2 DEBUG nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.609 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.609 2 INFO nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Creating image(s)
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.609 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "/var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.610 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "/var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.610 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "/var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.623 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.686 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.688 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.688 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.696 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.697 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.697 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.708 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.776 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.777 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.810 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.812 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.812 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.868 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.870 2 DEBUG nova.virt.disk.api [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Checking if we can resize image /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.871 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.927 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.928 2 DEBUG nova.virt.disk.api [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Cannot resize image /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.929 2 DEBUG nova.objects.instance [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'migration_context' on Instance uuid ed125c83-0f73-41e4-925c-db2354932843 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.946 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.947 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Ensure instance console log exists: /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.948 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.948 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:22 compute-0 nova_compute[192567]: 2025-10-02 08:11:22.949 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:23.111 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:11:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:23.112 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.181 2 DEBUG nova.network.neutron [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Successfully created port: 11fd9ac4-a789-4053-a2ed-1bf04b861368 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.659 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.864 2 DEBUG nova.network.neutron [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Successfully updated port: 11fd9ac4-a789-4053-a2ed-1bf04b861368 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.869 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.870 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.870 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.870 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2e661e5f-2462-4ffd-99a7-afc83d45f425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.894 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "refresh_cache-ed125c83-0f73-41e4-925c-db2354932843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.895 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquired lock "refresh_cache-ed125c83-0f73-41e4-925c-db2354932843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.896 2 DEBUG nova.network.neutron [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.970 2 DEBUG nova.compute.manager [req-c20c930a-694a-49a5-a1b1-bc299e687be5 req-c454ea3f-b9c7-4f2c-8ea2-45a233b942f6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Received event network-changed-11fd9ac4-a789-4053-a2ed-1bf04b861368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.970 2 DEBUG nova.compute.manager [req-c20c930a-694a-49a5-a1b1-bc299e687be5 req-c454ea3f-b9c7-4f2c-8ea2-45a233b942f6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Refreshing instance network info cache due to event network-changed-11fd9ac4-a789-4053-a2ed-1bf04b861368. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:11:23 compute-0 nova_compute[192567]: 2025-10-02 08:11:23.971 2 DEBUG oslo_concurrency.lockutils [req-c20c930a-694a-49a5-a1b1-bc299e687be5 req-c454ea3f-b9c7-4f2c-8ea2-45a233b942f6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-ed125c83-0f73-41e4-925c-db2354932843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:11:24 compute-0 nova_compute[192567]: 2025-10-02 08:11:24.124 2 DEBUG nova.network.neutron [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:11:24 compute-0 nova_compute[192567]: 2025-10-02 08:11:24.961 2 DEBUG nova.network.neutron [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Updating instance_info_cache with network_info: [{"id": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "address": "fa:16:3e:52:70:8c", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11fd9ac4-a7", "ovs_interfaceid": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:11:24 compute-0 nova_compute[192567]: 2025-10-02 08:11:24.980 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Releasing lock "refresh_cache-ed125c83-0f73-41e4-925c-db2354932843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:11:24 compute-0 nova_compute[192567]: 2025-10-02 08:11:24.980 2 DEBUG nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Instance network_info: |[{"id": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "address": "fa:16:3e:52:70:8c", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11fd9ac4-a7", "ovs_interfaceid": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:11:24 compute-0 nova_compute[192567]: 2025-10-02 08:11:24.980 2 DEBUG oslo_concurrency.lockutils [req-c20c930a-694a-49a5-a1b1-bc299e687be5 req-c454ea3f-b9c7-4f2c-8ea2-45a233b942f6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-ed125c83-0f73-41e4-925c-db2354932843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:11:24 compute-0 nova_compute[192567]: 2025-10-02 08:11:24.981 2 DEBUG nova.network.neutron [req-c20c930a-694a-49a5-a1b1-bc299e687be5 req-c454ea3f-b9c7-4f2c-8ea2-45a233b942f6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Refreshing network info cache for port 11fd9ac4-a789-4053-a2ed-1bf04b861368 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:11:24 compute-0 nova_compute[192567]: 2025-10-02 08:11:24.984 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Start _get_guest_xml network_info=[{"id": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "address": "fa:16:3e:52:70:8c", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11fd9ac4-a7", "ovs_interfaceid": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:11:24 compute-0 nova_compute[192567]: 2025-10-02 08:11:24.990 2 WARNING nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.000 2 DEBUG nova.virt.libvirt.host [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.000 2 DEBUG nova.virt.libvirt.host [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.011 2 DEBUG nova.virt.libvirt.host [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.011 2 DEBUG nova.virt.libvirt.host [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.012 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.012 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.013 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.014 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.014 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.015 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.016 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.016 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.017 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.017 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.018 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.018 2 DEBUG nova.virt.hardware [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.024 2 DEBUG nova.virt.libvirt.vif [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1025051940',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1025051940',id=6,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-8hhpdkh2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-proj
ect-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:11:22Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=ed125c83-0f73-41e4-925c-db2354932843,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "address": "fa:16:3e:52:70:8c", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11fd9ac4-a7", "ovs_interfaceid": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.024 2 DEBUG nova.network.os_vif_util [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "address": "fa:16:3e:52:70:8c", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11fd9ac4-a7", "ovs_interfaceid": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.025 2 DEBUG nova.network.os_vif_util [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:8c,bridge_name='br-int',has_traffic_filtering=True,id=11fd9ac4-a789-4053-a2ed-1bf04b861368,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11fd9ac4-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.027 2 DEBUG nova.objects.instance [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'pci_devices' on Instance uuid ed125c83-0f73-41e4-925c-db2354932843 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.042 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <uuid>ed125c83-0f73-41e4-925c-db2354932843</uuid>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <name>instance-00000006</name>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1025051940</nova:name>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:11:24</nova:creationTime>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:11:25 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:11:25 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:11:25 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:11:25 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:11:25 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:11:25 compute-0 nova_compute[192567]:         <nova:user uuid="4b5c71b386a34e829eef47bf613d813c">tempest-TestExecuteActionsViaActuator-547955480-project-admin</nova:user>
Oct 02 08:11:25 compute-0 nova_compute[192567]:         <nova:project uuid="a5d6400b4e3f4d98a7456330f6429bd5">tempest-TestExecuteActionsViaActuator-547955480</nova:project>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:11:25 compute-0 nova_compute[192567]:         <nova:port uuid="11fd9ac4-a789-4053-a2ed-1bf04b861368">
Oct 02 08:11:25 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <system>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <entry name="serial">ed125c83-0f73-41e4-925c-db2354932843</entry>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <entry name="uuid">ed125c83-0f73-41e4-925c-db2354932843</entry>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     </system>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <os>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   </os>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <features>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   </features>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk.config"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:52:70:8c"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <target dev="tap11fd9ac4-a7"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/console.log" append="off"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <video>
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     </video>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:11:25 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:11:25 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:11:25 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:11:25 compute-0 nova_compute[192567]: </domain>
Oct 02 08:11:25 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.044 2 DEBUG nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Preparing to wait for external event network-vif-plugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.044 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "ed125c83-0f73-41e4-925c-db2354932843-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.045 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.045 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.046 2 DEBUG nova.virt.libvirt.vif [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1025051940',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1025051940',id=6,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-8hhpdkh2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:11:22Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=ed125c83-0f73-41e4-925c-db2354932843,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "address": "fa:16:3e:52:70:8c", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11fd9ac4-a7", "ovs_interfaceid": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.046 2 DEBUG nova.network.os_vif_util [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "address": "fa:16:3e:52:70:8c", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11fd9ac4-a7", "ovs_interfaceid": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.047 2 DEBUG nova.network.os_vif_util [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:8c,bridge_name='br-int',has_traffic_filtering=True,id=11fd9ac4-a789-4053-a2ed-1bf04b861368,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11fd9ac4-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.048 2 DEBUG os_vif [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:8c,bridge_name='br-int',has_traffic_filtering=True,id=11fd9ac4-a789-4053-a2ed-1bf04b861368,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11fd9ac4-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.049 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.049 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.054 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap11fd9ac4-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.054 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap11fd9ac4-a7, col_values=(('external_ids', {'iface-id': '11fd9ac4-a789-4053-a2ed-1bf04b861368', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:70:8c', 'vm-uuid': 'ed125c83-0f73-41e4-925c-db2354932843'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:25 compute-0 NetworkManager[51654]: <info>  [1759392685.0571] manager: (tap11fd9ac4-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.066 2 INFO os_vif [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:8c,bridge_name='br-int',has_traffic_filtering=True,id=11fd9ac4-a789-4053-a2ed-1bf04b861368,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11fd9ac4-a7')
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.111 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.112 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.113 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] No VIF found with MAC fa:16:3e:52:70:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.114 2 INFO nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Using config drive
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.165 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Updating instance_info_cache with network_info: [{"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.188 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-2e661e5f-2462-4ffd-99a7-afc83d45f425" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.188 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.189 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.190 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.190 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.200 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.471 2 INFO nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Creating config drive at /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk.config
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.477 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8udp7289 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.614 2 DEBUG oslo_concurrency.processutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8udp7289" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.635 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.635 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:11:25 compute-0 kernel: tap11fd9ac4-a7: entered promiscuous mode
Oct 02 08:11:25 compute-0 NetworkManager[51654]: <info>  [1759392685.6996] manager: (tap11fd9ac4-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:25 compute-0 ovn_controller[94821]: 2025-10-02T08:11:25Z|00037|binding|INFO|Claiming lport 11fd9ac4-a789-4053-a2ed-1bf04b861368 for this chassis.
Oct 02 08:11:25 compute-0 ovn_controller[94821]: 2025-10-02T08:11:25Z|00038|binding|INFO|11fd9ac4-a789-4053-a2ed-1bf04b861368: Claiming fa:16:3e:52:70:8c 10.100.0.12
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.708 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:70:8c 10.100.0.12'], port_security=['fa:16:3e:52:70:8c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ed125c83-0f73-41e4-925c-db2354932843', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d6400b4e3f4d98a7456330f6429bd5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cba96dbc-c401-4d81-b355-4680d6ad5e15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d06b6f4b-ccde-4903-a1fe-e6bac9f52057, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=11fd9ac4-a789-4053-a2ed-1bf04b861368) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.711 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 11fd9ac4-a789-4053-a2ed-1bf04b861368 in datapath 441198e3-04ff-48aa-b8a7-2339e4bb8085 bound to our chassis
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.714 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:11:25 compute-0 ovn_controller[94821]: 2025-10-02T08:11:25Z|00039|binding|INFO|Setting lport 11fd9ac4-a789-4053-a2ed-1bf04b861368 ovn-installed in OVS
Oct 02 08:11:25 compute-0 ovn_controller[94821]: 2025-10-02T08:11:25Z|00040|binding|INFO|Setting lport 11fd9ac4-a789-4053-a2ed-1bf04b861368 up in Southbound
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.744 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[043e0cd3-e3ae-4d74-96dd-221256588dd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:25 compute-0 systemd-udevd[215830]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:11:25 compute-0 NetworkManager[51654]: <info>  [1759392685.7764] device (tap11fd9ac4-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:11:25 compute-0 NetworkManager[51654]: <info>  [1759392685.7779] device (tap11fd9ac4-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:11:25 compute-0 systemd-machined[152597]: New machine qemu-3-instance-00000006.
Oct 02 08:11:25 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.791 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[83b209bc-5b8a-4a2c-b3b4-7e46d1ce1cc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.795 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[68f4e9b2-0276-486d-b09f-b6ef01f61859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.837 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[fb552806-2da3-4a86-a1a2-bad3c93d29fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.861 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[37868b56-f961-43b4-9a98-78b729431ad6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap441198e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:13:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348164, 'reachable_time': 44332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215841, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.884 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[6412546a-0f81-4a33-a3eb-41c4dab10eb4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348180, 'tstamp': 348180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215844, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348183, 'tstamp': 348183}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215844, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.886 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441198e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:25 compute-0 nova_compute[192567]: 2025-10-02 08:11:25.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.891 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap441198e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.891 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.892 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap441198e3-00, col_values=(('external_ids', {'iface-id': 'f4e4745f-6cb7-4dfe-930a-ab5c5f2db11b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:25 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:25.893 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:11:27 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:27.114 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.125 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392687.1248517, ed125c83-0f73-41e4-925c-db2354932843 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.126 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] VM Started (Lifecycle Event)
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.155 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.160 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392687.1259542, ed125c83-0f73-41e4-925c-db2354932843 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.161 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] VM Paused (Lifecycle Event)
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.182 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.187 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.208 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.950 2 DEBUG nova.compute.manager [req-695a7ed3-fe33-463c-8700-3fcf2939b05c req-8c80eec4-7c3c-4b67-ae43-2bc4d2f58d14 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Received event network-vif-plugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.951 2 DEBUG oslo_concurrency.lockutils [req-695a7ed3-fe33-463c-8700-3fcf2939b05c req-8c80eec4-7c3c-4b67-ae43-2bc4d2f58d14 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "ed125c83-0f73-41e4-925c-db2354932843-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.952 2 DEBUG oslo_concurrency.lockutils [req-695a7ed3-fe33-463c-8700-3fcf2939b05c req-8c80eec4-7c3c-4b67-ae43-2bc4d2f58d14 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.952 2 DEBUG oslo_concurrency.lockutils [req-695a7ed3-fe33-463c-8700-3fcf2939b05c req-8c80eec4-7c3c-4b67-ae43-2bc4d2f58d14 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.953 2 DEBUG nova.compute.manager [req-695a7ed3-fe33-463c-8700-3fcf2939b05c req-8c80eec4-7c3c-4b67-ae43-2bc4d2f58d14 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Processing event network-vif-plugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.954 2 DEBUG nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.959 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392687.9596317, ed125c83-0f73-41e4-925c-db2354932843 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.960 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] VM Resumed (Lifecycle Event)
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.964 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.969 2 INFO nova.virt.libvirt.driver [-] [instance: ed125c83-0f73-41e4-925c-db2354932843] Instance spawned successfully.
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.971 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:11:27 compute-0 nova_compute[192567]: 2025-10-02 08:11:27.991 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.000 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.007 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.008 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.009 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.010 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.011 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.011 2 DEBUG nova.virt.libvirt.driver [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.026 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.047 2 DEBUG nova.network.neutron [req-c20c930a-694a-49a5-a1b1-bc299e687be5 req-c454ea3f-b9c7-4f2c-8ea2-45a233b942f6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Updated VIF entry in instance network info cache for port 11fd9ac4-a789-4053-a2ed-1bf04b861368. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.047 2 DEBUG nova.network.neutron [req-c20c930a-694a-49a5-a1b1-bc299e687be5 req-c454ea3f-b9c7-4f2c-8ea2-45a233b942f6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Updating instance_info_cache with network_info: [{"id": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "address": "fa:16:3e:52:70:8c", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11fd9ac4-a7", "ovs_interfaceid": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.070 2 DEBUG oslo_concurrency.lockutils [req-c20c930a-694a-49a5-a1b1-bc299e687be5 req-c454ea3f-b9c7-4f2c-8ea2-45a233b942f6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-ed125c83-0f73-41e4-925c-db2354932843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.086 2 INFO nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Took 5.48 seconds to spawn the instance on the hypervisor.
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.086 2 DEBUG nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.148 2 INFO nova.compute.manager [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Took 6.00 seconds to build instance.
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.166 2 DEBUG oslo_concurrency.lockutils [None req-7819e7d6-4f14-4697-9e53-fede74a9cb51 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:28 compute-0 nova_compute[192567]: 2025-10-02 08:11:28.642 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:29 compute-0 nova_compute[192567]: 2025-10-02 08:11:29.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:29 compute-0 podman[203011]: time="2025-10-02T08:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:11:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:11:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3462 "" "Go-http-client/1.1"
Oct 02 08:11:30 compute-0 nova_compute[192567]: 2025-10-02 08:11:30.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:30 compute-0 nova_compute[192567]: 2025-10-02 08:11:30.092 2 DEBUG nova.compute.manager [req-1ad017af-2686-43a0-9e0d-6998cbe9811d req-9db75248-c49f-4531-b5a5-0f9db6674edb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Received event network-vif-plugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:11:30 compute-0 nova_compute[192567]: 2025-10-02 08:11:30.093 2 DEBUG oslo_concurrency.lockutils [req-1ad017af-2686-43a0-9e0d-6998cbe9811d req-9db75248-c49f-4531-b5a5-0f9db6674edb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "ed125c83-0f73-41e4-925c-db2354932843-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:30 compute-0 nova_compute[192567]: 2025-10-02 08:11:30.094 2 DEBUG oslo_concurrency.lockutils [req-1ad017af-2686-43a0-9e0d-6998cbe9811d req-9db75248-c49f-4531-b5a5-0f9db6674edb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:30 compute-0 nova_compute[192567]: 2025-10-02 08:11:30.094 2 DEBUG oslo_concurrency.lockutils [req-1ad017af-2686-43a0-9e0d-6998cbe9811d req-9db75248-c49f-4531-b5a5-0f9db6674edb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:30 compute-0 nova_compute[192567]: 2025-10-02 08:11:30.095 2 DEBUG nova.compute.manager [req-1ad017af-2686-43a0-9e0d-6998cbe9811d req-9db75248-c49f-4531-b5a5-0f9db6674edb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] No waiting events found dispatching network-vif-plugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:11:30 compute-0 nova_compute[192567]: 2025-10-02 08:11:30.096 2 WARNING nova.compute.manager [req-1ad017af-2686-43a0-9e0d-6998cbe9811d req-9db75248-c49f-4531-b5a5-0f9db6674edb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Received unexpected event network-vif-plugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 for instance with vm_state active and task_state None.
Oct 02 08:11:30 compute-0 nova_compute[192567]: 2025-10-02 08:11:30.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:31 compute-0 podman[215853]: 2025-10-02 08:11:31.192824455 +0000 UTC m=+0.098944564 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Oct 02 08:11:31 compute-0 openstack_network_exporter[205118]: ERROR   08:11:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:11:31 compute-0 openstack_network_exporter[205118]: ERROR   08:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:11:31 compute-0 openstack_network_exporter[205118]: ERROR   08:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:11:31 compute-0 openstack_network_exporter[205118]: ERROR   08:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:11:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:11:31 compute-0 openstack_network_exporter[205118]: ERROR   08:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:11:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:11:35 compute-0 nova_compute[192567]: 2025-10-02 08:11:35.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:35 compute-0 nova_compute[192567]: 2025-10-02 08:11:35.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:40 compute-0 nova_compute[192567]: 2025-10-02 08:11:40.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:40 compute-0 podman[215893]: 2025-10-02 08:11:40.178561923 +0000 UTC m=+0.077749328 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 02 08:11:40 compute-0 podman[215891]: 2025-10-02 08:11:40.185826928 +0000 UTC m=+0.086908921 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:11:40 compute-0 nova_compute[192567]: 2025-10-02 08:11:40.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:40 compute-0 podman[215892]: 2025-10-02 08:11:40.247658832 +0000 UTC m=+0.145606999 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:11:40 compute-0 ovn_controller[94821]: 2025-10-02T08:11:40Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:52:70:8c 10.100.0.12
Oct 02 08:11:40 compute-0 ovn_controller[94821]: 2025-10-02T08:11:40Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:70:8c 10.100.0.12
Oct 02 08:11:42 compute-0 podman[215955]: 2025-10-02 08:11:42.181714304 +0000 UTC m=+0.092721361 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=iscsid, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.618 2 DEBUG nova.virt.libvirt.driver [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Creating tmpfile /var/lib/nova/instances/tmp8tkgfpvi to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.724 2 DEBUG nova.compute.manager [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8tkgfpvi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.731 2 DEBUG nova.compute.manager [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.754 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.755 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.776 2 INFO nova.compute.rpcapi [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.777 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.804 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.804 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.841 2 DEBUG nova.objects.instance [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'pci_requests' on Instance uuid b4d496c6-fc60-476d-84fa-b8183df48147 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.859 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.859 2 INFO nova.compute.claims [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.860 2 DEBUG nova.objects.instance [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'resources' on Instance uuid b4d496c6-fc60-476d-84fa-b8183df48147 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.873 2 DEBUG nova.objects.instance [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'numa_topology' on Instance uuid b4d496c6-fc60-476d-84fa-b8183df48147 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.887 2 DEBUG nova.objects.instance [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid b4d496c6-fc60-476d-84fa-b8183df48147 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.922 2 INFO nova.compute.resource_tracker [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Updating resource usage from migration 7eecafc5-1e44-4e5a-9eca-7217870e07df
Oct 02 08:11:42 compute-0 nova_compute[192567]: 2025-10-02 08:11:42.923 2 DEBUG nova.compute.resource_tracker [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Starting to track incoming migration 7eecafc5-1e44-4e5a-9eca-7217870e07df with flavor 932d352e-81e8-4137-94d3-19616d5c2ae2 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Oct 02 08:11:43 compute-0 nova_compute[192567]: 2025-10-02 08:11:43.076 2 DEBUG nova.compute.provider_tree [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:11:43 compute-0 nova_compute[192567]: 2025-10-02 08:11:43.107 2 DEBUG nova.scheduler.client.report [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:11:43 compute-0 nova_compute[192567]: 2025-10-02 08:11:43.139 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:43 compute-0 nova_compute[192567]: 2025-10-02 08:11:43.140 2 INFO nova.compute.manager [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Migrating
Oct 02 08:11:45 compute-0 nova_compute[192567]: 2025-10-02 08:11:45.044 2 DEBUG nova.compute.manager [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8tkgfpvi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='99ec0256-cf67-4122-81b7-d0767c5a1347',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:11:45 compute-0 nova_compute[192567]: 2025-10-02 08:11:45.067 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-99ec0256-cf67-4122-81b7-d0767c5a1347" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:11:45 compute-0 nova_compute[192567]: 2025-10-02 08:11:45.067 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-99ec0256-cf67-4122-81b7-d0767c5a1347" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:11:45 compute-0 nova_compute[192567]: 2025-10-02 08:11:45.068 2 DEBUG nova.network.neutron [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:11:45 compute-0 nova_compute[192567]: 2025-10-02 08:11:45.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:45 compute-0 nova_compute[192567]: 2025-10-02 08:11:45.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:45.968 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:45.969 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:45.970 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:46 compute-0 sshd-session[215975]: Accepted publickey for nova from 192.168.122.101 port 33234 ssh2: ECDSA SHA256:nyj9easCU2+zJyxXdAvgdE/0ePVxCLkFf7X2/rv3WZg
Oct 02 08:11:46 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Oct 02 08:11:46 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct 02 08:11:46 compute-0 systemd-logind[827]: New session 29 of user nova.
Oct 02 08:11:46 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct 02 08:11:46 compute-0 systemd[1]: Starting User Manager for UID 42436...
Oct 02 08:11:46 compute-0 podman[215977]: 2025-10-02 08:11:46.351577299 +0000 UTC m=+0.094834686 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:11:46 compute-0 systemd[215996]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:11:46 compute-0 sshd-session[215874]: Connection closed by 183.171.154.29 port 42724 [preauth]
Oct 02 08:11:46 compute-0 systemd[215996]: Queued start job for default target Main User Target.
Oct 02 08:11:46 compute-0 systemd[215996]: Created slice User Application Slice.
Oct 02 08:11:46 compute-0 systemd[215996]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 02 08:11:46 compute-0 systemd[215996]: Started Daily Cleanup of User's Temporary Directories.
Oct 02 08:11:46 compute-0 systemd[215996]: Reached target Paths.
Oct 02 08:11:46 compute-0 systemd[215996]: Reached target Timers.
Oct 02 08:11:46 compute-0 systemd[215996]: Starting D-Bus User Message Bus Socket...
Oct 02 08:11:46 compute-0 systemd[215996]: Starting Create User's Volatile Files and Directories...
Oct 02 08:11:46 compute-0 systemd[215996]: Listening on D-Bus User Message Bus Socket.
Oct 02 08:11:46 compute-0 systemd[215996]: Reached target Sockets.
Oct 02 08:11:46 compute-0 systemd[215996]: Finished Create User's Volatile Files and Directories.
Oct 02 08:11:46 compute-0 systemd[215996]: Reached target Basic System.
Oct 02 08:11:46 compute-0 systemd[1]: Started User Manager for UID 42436.
Oct 02 08:11:46 compute-0 systemd[215996]: Reached target Main User Target.
Oct 02 08:11:46 compute-0 systemd[215996]: Startup finished in 184ms.
Oct 02 08:11:46 compute-0 systemd[1]: Started Session 29 of User nova.
Oct 02 08:11:46 compute-0 sshd-session[215975]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:11:46 compute-0 sshd-session[216018]: Received disconnect from 192.168.122.101 port 33234:11: disconnected by user
Oct 02 08:11:46 compute-0 sshd-session[216018]: Disconnected from user nova 192.168.122.101 port 33234
Oct 02 08:11:46 compute-0 sshd-session[215975]: pam_unix(sshd:session): session closed for user nova
Oct 02 08:11:46 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Oct 02 08:11:46 compute-0 systemd-logind[827]: Session 29 logged out. Waiting for processes to exit.
Oct 02 08:11:46 compute-0 systemd-logind[827]: Removed session 29.
Oct 02 08:11:46 compute-0 sshd-session[216020]: Accepted publickey for nova from 192.168.122.101 port 33246 ssh2: ECDSA SHA256:nyj9easCU2+zJyxXdAvgdE/0ePVxCLkFf7X2/rv3WZg
Oct 02 08:11:46 compute-0 systemd-logind[827]: New session 31 of user nova.
Oct 02 08:11:46 compute-0 systemd[1]: Started Session 31 of User nova.
Oct 02 08:11:46 compute-0 sshd-session[216020]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:11:46 compute-0 sshd-session[216024]: Received disconnect from 192.168.122.101 port 33246:11: disconnected by user
Oct 02 08:11:46 compute-0 sshd-session[216024]: Disconnected from user nova 192.168.122.101 port 33246
Oct 02 08:11:46 compute-0 sshd-session[216020]: pam_unix(sshd:session): session closed for user nova
Oct 02 08:11:46 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Oct 02 08:11:46 compute-0 systemd-logind[827]: Session 31 logged out. Waiting for processes to exit.
Oct 02 08:11:46 compute-0 systemd-logind[827]: Removed session 31.
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.060 2 DEBUG nova.network.neutron [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Updating instance_info_cache with network_info: [{"id": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "address": "fa:16:3e:85:db:a8", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c8fd5e6-22", "ovs_interfaceid": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.081 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-99ec0256-cf67-4122-81b7-d0767c5a1347" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.084 2 DEBUG nova.virt.libvirt.driver [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8tkgfpvi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='99ec0256-cf67-4122-81b7-d0767c5a1347',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.085 2 DEBUG nova.virt.libvirt.driver [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Creating instance directory: /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.086 2 DEBUG nova.virt.libvirt.driver [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Creating disk.info with the contents: {'/var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk': 'qcow2', '/var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.087 2 DEBUG nova.virt.libvirt.driver [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.089 2 DEBUG nova.objects.instance [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 99ec0256-cf67-4122-81b7-d0767c5a1347 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.133 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.231 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.233 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.235 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.267 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.354 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.355 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.396 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.397 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.398 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.465 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.467 2 DEBUG nova.virt.disk.api [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.468 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.525 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.527 2 DEBUG nova.virt.disk.api [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.527 2 DEBUG nova.objects.instance [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 99ec0256-cf67-4122-81b7-d0767c5a1347 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.542 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.577 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk.config 485376" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.580 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk.config to /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:11:47 compute-0 nova_compute[192567]: 2025-10-02 08:11:47.580 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk.config /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.060 2 DEBUG oslo_concurrency.processutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347/disk.config /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.061 2 DEBUG nova.virt.libvirt.driver [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.063 2 DEBUG nova.virt.libvirt.vif [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:10:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-697284282',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-697284282',id=3,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:10:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-vaul1op1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:10:32Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=99ec0256-cf67-4122-81b7-d0767c5a1347,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "address": "fa:16:3e:85:db:a8", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3c8fd5e6-22", "ovs_interfaceid": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.063 2 DEBUG nova.network.os_vif_util [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "address": "fa:16:3e:85:db:a8", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3c8fd5e6-22", "ovs_interfaceid": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.064 2 DEBUG nova.network.os_vif_util [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:db:a8,bridge_name='br-int',has_traffic_filtering=True,id=3c8fd5e6-225b-47a2-81df-d81b2d20fa8b,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c8fd5e6-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.065 2 DEBUG os_vif [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:db:a8,bridge_name='br-int',has_traffic_filtering=True,id=3c8fd5e6-225b-47a2-81df-d81b2d20fa8b,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c8fd5e6-22') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.066 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.067 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.070 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c8fd5e6-22, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.071 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3c8fd5e6-22, col_values=(('external_ids', {'iface-id': '3c8fd5e6-225b-47a2-81df-d81b2d20fa8b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:db:a8', 'vm-uuid': '99ec0256-cf67-4122-81b7-d0767c5a1347'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:48 compute-0 NetworkManager[51654]: <info>  [1759392708.0739] manager: (tap3c8fd5e6-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.084 2 INFO os_vif [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:db:a8,bridge_name='br-int',has_traffic_filtering=True,id=3c8fd5e6-225b-47a2-81df-d81b2d20fa8b,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c8fd5e6-22')
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.085 2 DEBUG nova.virt.libvirt.driver [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:11:48 compute-0 nova_compute[192567]: 2025-10-02 08:11:48.085 2 DEBUG nova.compute.manager [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8tkgfpvi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='99ec0256-cf67-4122-81b7-d0767c5a1347',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:11:49 compute-0 nova_compute[192567]: 2025-10-02 08:11:49.799 2 DEBUG nova.network.neutron [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Port 3c8fd5e6-225b-47a2-81df-d81b2d20fa8b updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:11:49 compute-0 nova_compute[192567]: 2025-10-02 08:11:49.802 2 DEBUG nova.compute.manager [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8tkgfpvi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='99ec0256-cf67-4122-81b7-d0767c5a1347',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:11:49 compute-0 nova_compute[192567]: 2025-10-02 08:11:49.846 2 DEBUG nova.compute.manager [req-cdd198a6-e69d-4f02-808b-d4f50dc04f49 req-df6293c9-3578-4863-a220-5a2fe449fef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received event network-vif-unplugged-71aeead1-a439-4326-93bb-38c3281661f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:11:49 compute-0 nova_compute[192567]: 2025-10-02 08:11:49.847 2 DEBUG oslo_concurrency.lockutils [req-cdd198a6-e69d-4f02-808b-d4f50dc04f49 req-df6293c9-3578-4863-a220-5a2fe449fef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:49 compute-0 nova_compute[192567]: 2025-10-02 08:11:49.847 2 DEBUG oslo_concurrency.lockutils [req-cdd198a6-e69d-4f02-808b-d4f50dc04f49 req-df6293c9-3578-4863-a220-5a2fe449fef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:49 compute-0 nova_compute[192567]: 2025-10-02 08:11:49.848 2 DEBUG oslo_concurrency.lockutils [req-cdd198a6-e69d-4f02-808b-d4f50dc04f49 req-df6293c9-3578-4863-a220-5a2fe449fef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:49 compute-0 nova_compute[192567]: 2025-10-02 08:11:49.848 2 DEBUG nova.compute.manager [req-cdd198a6-e69d-4f02-808b-d4f50dc04f49 req-df6293c9-3578-4863-a220-5a2fe449fef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] No waiting events found dispatching network-vif-unplugged-71aeead1-a439-4326-93bb-38c3281661f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:11:49 compute-0 nova_compute[192567]: 2025-10-02 08:11:49.848 2 WARNING nova.compute.manager [req-cdd198a6-e69d-4f02-808b-d4f50dc04f49 req-df6293c9-3578-4863-a220-5a2fe449fef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received unexpected event network-vif-unplugged-71aeead1-a439-4326-93bb-38c3281661f2 for instance with vm_state active and task_state resize_migrating.
Oct 02 08:11:49 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 08:11:50 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 08:11:50 compute-0 kernel: tap3c8fd5e6-22: entered promiscuous mode
Oct 02 08:11:50 compute-0 sshd-session[216067]: Accepted publickey for nova from 192.168.122.101 port 33248 ssh2: ECDSA SHA256:nyj9easCU2+zJyxXdAvgdE/0ePVxCLkFf7X2/rv3WZg
Oct 02 08:11:50 compute-0 NetworkManager[51654]: <info>  [1759392710.2579] manager: (tap3c8fd5e6-22): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Oct 02 08:11:50 compute-0 ovn_controller[94821]: 2025-10-02T08:11:50Z|00041|binding|INFO|Claiming lport 3c8fd5e6-225b-47a2-81df-d81b2d20fa8b for this additional chassis.
Oct 02 08:11:50 compute-0 ovn_controller[94821]: 2025-10-02T08:11:50Z|00042|binding|INFO|3c8fd5e6-225b-47a2-81df-d81b2d20fa8b: Claiming fa:16:3e:85:db:a8 10.100.0.10
Oct 02 08:11:50 compute-0 nova_compute[192567]: 2025-10-02 08:11:50.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:50 compute-0 systemd-logind[827]: New session 32 of user nova.
Oct 02 08:11:50 compute-0 ovn_controller[94821]: 2025-10-02T08:11:50Z|00043|binding|INFO|Setting lport 3c8fd5e6-225b-47a2-81df-d81b2d20fa8b ovn-installed in OVS
Oct 02 08:11:50 compute-0 nova_compute[192567]: 2025-10-02 08:11:50.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:50 compute-0 nova_compute[192567]: 2025-10-02 08:11:50.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:50 compute-0 systemd[1]: Started Session 32 of User nova.
Oct 02 08:11:50 compute-0 sshd-session[216067]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:11:50 compute-0 systemd-machined[152597]: New machine qemu-4-instance-00000003.
Oct 02 08:11:50 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000003.
Oct 02 08:11:50 compute-0 systemd-udevd[216086]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:11:50 compute-0 NetworkManager[51654]: <info>  [1759392710.3765] device (tap3c8fd5e6-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:11:50 compute-0 NetworkManager[51654]: <info>  [1759392710.3782] device (tap3c8fd5e6-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:11:50 compute-0 sshd-session[216085]: Received disconnect from 192.168.122.101 port 33248:11: disconnected by user
Oct 02 08:11:50 compute-0 sshd-session[216085]: Disconnected from user nova 192.168.122.101 port 33248
Oct 02 08:11:50 compute-0 sshd-session[216067]: pam_unix(sshd:session): session closed for user nova
Oct 02 08:11:50 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Oct 02 08:11:50 compute-0 systemd-logind[827]: Session 32 logged out. Waiting for processes to exit.
Oct 02 08:11:50 compute-0 systemd-logind[827]: Removed session 32.
Oct 02 08:11:50 compute-0 sshd-session[216103]: Accepted publickey for nova from 192.168.122.101 port 33258 ssh2: ECDSA SHA256:nyj9easCU2+zJyxXdAvgdE/0ePVxCLkFf7X2/rv3WZg
Oct 02 08:11:50 compute-0 systemd-logind[827]: New session 33 of user nova.
Oct 02 08:11:51 compute-0 systemd[1]: Started Session 33 of User nova.
Oct 02 08:11:51 compute-0 sshd-session[216103]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:11:51 compute-0 sshd-session[216106]: Received disconnect from 192.168.122.101 port 33258:11: disconnected by user
Oct 02 08:11:51 compute-0 sshd-session[216106]: Disconnected from user nova 192.168.122.101 port 33258
Oct 02 08:11:51 compute-0 sshd-session[216103]: pam_unix(sshd:session): session closed for user nova
Oct 02 08:11:51 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Oct 02 08:11:51 compute-0 systemd-logind[827]: Session 33 logged out. Waiting for processes to exit.
Oct 02 08:11:51 compute-0 systemd-logind[827]: Removed session 33.
Oct 02 08:11:51 compute-0 sshd-session[216108]: Accepted publickey for nova from 192.168.122.101 port 33264 ssh2: ECDSA SHA256:nyj9easCU2+zJyxXdAvgdE/0ePVxCLkFf7X2/rv3WZg
Oct 02 08:11:51 compute-0 sshd-session[216023]: Invalid user centos from 117.175.160.58 port 37646
Oct 02 08:11:51 compute-0 sshd-session[216023]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 08:11:51 compute-0 sshd-session[216023]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=117.175.160.58
Oct 02 08:11:51 compute-0 systemd-logind[827]: New session 34 of user nova.
Oct 02 08:11:51 compute-0 systemd[1]: Started Session 34 of User nova.
Oct 02 08:11:51 compute-0 sshd-session[216108]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:11:51 compute-0 sshd-session[216111]: Received disconnect from 192.168.122.101 port 33264:11: disconnected by user
Oct 02 08:11:51 compute-0 sshd-session[216111]: Disconnected from user nova 192.168.122.101 port 33264
Oct 02 08:11:51 compute-0 sshd-session[216108]: pam_unix(sshd:session): session closed for user nova
Oct 02 08:11:51 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Oct 02 08:11:51 compute-0 systemd-logind[827]: Session 34 logged out. Waiting for processes to exit.
Oct 02 08:11:51 compute-0 systemd-logind[827]: Removed session 34.
Oct 02 08:11:51 compute-0 nova_compute[192567]: 2025-10-02 08:11:51.517 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392711.5158465, 99ec0256-cf67-4122-81b7-d0767c5a1347 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:11:51 compute-0 nova_compute[192567]: 2025-10-02 08:11:51.519 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] VM Started (Lifecycle Event)
Oct 02 08:11:51 compute-0 nova_compute[192567]: 2025-10-02 08:11:51.538 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:11:51 compute-0 nova_compute[192567]: 2025-10-02 08:11:51.971 2 DEBUG nova.compute.manager [req-db1f09ed-a997-49a1-8af2-4b5b505ef6d2 req-4f361438-412a-4578-933d-347400e55157 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received event network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:11:51 compute-0 nova_compute[192567]: 2025-10-02 08:11:51.972 2 DEBUG oslo_concurrency.lockutils [req-db1f09ed-a997-49a1-8af2-4b5b505ef6d2 req-4f361438-412a-4578-933d-347400e55157 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:51 compute-0 nova_compute[192567]: 2025-10-02 08:11:51.972 2 DEBUG oslo_concurrency.lockutils [req-db1f09ed-a997-49a1-8af2-4b5b505ef6d2 req-4f361438-412a-4578-933d-347400e55157 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:51 compute-0 nova_compute[192567]: 2025-10-02 08:11:51.972 2 DEBUG oslo_concurrency.lockutils [req-db1f09ed-a997-49a1-8af2-4b5b505ef6d2 req-4f361438-412a-4578-933d-347400e55157 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:51 compute-0 nova_compute[192567]: 2025-10-02 08:11:51.972 2 DEBUG nova.compute.manager [req-db1f09ed-a997-49a1-8af2-4b5b505ef6d2 req-4f361438-412a-4578-933d-347400e55157 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] No waiting events found dispatching network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:11:51 compute-0 nova_compute[192567]: 2025-10-02 08:11:51.973 2 WARNING nova.compute.manager [req-db1f09ed-a997-49a1-8af2-4b5b505ef6d2 req-4f361438-412a-4578-933d-347400e55157 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received unexpected event network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 for instance with vm_state active and task_state resize_migrated.
Oct 02 08:11:52 compute-0 nova_compute[192567]: 2025-10-02 08:11:52.134 2 INFO nova.network.neutron [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Updating port 71aeead1-a439-4326-93bb-38c3281661f2 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Oct 02 08:11:52 compute-0 nova_compute[192567]: 2025-10-02 08:11:52.284 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392712.283657, 99ec0256-cf67-4122-81b7-d0767c5a1347 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:11:52 compute-0 nova_compute[192567]: 2025-10-02 08:11:52.285 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] VM Resumed (Lifecycle Event)
Oct 02 08:11:52 compute-0 nova_compute[192567]: 2025-10-02 08:11:52.323 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:11:52 compute-0 nova_compute[192567]: 2025-10-02 08:11:52.326 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:11:52 compute-0 nova_compute[192567]: 2025-10-02 08:11:52.360 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:11:52 compute-0 sshd-session[216023]: Failed password for invalid user centos from 117.175.160.58 port 37646 ssh2
Oct 02 08:11:53 compute-0 nova_compute[192567]: 2025-10-02 08:11:53.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:53 compute-0 ovn_controller[94821]: 2025-10-02T08:11:53Z|00044|binding|INFO|Claiming lport 3c8fd5e6-225b-47a2-81df-d81b2d20fa8b for this chassis.
Oct 02 08:11:53 compute-0 ovn_controller[94821]: 2025-10-02T08:11:53Z|00045|binding|INFO|3c8fd5e6-225b-47a2-81df-d81b2d20fa8b: Claiming fa:16:3e:85:db:a8 10.100.0.10
Oct 02 08:11:53 compute-0 ovn_controller[94821]: 2025-10-02T08:11:53Z|00046|binding|INFO|Setting lport 3c8fd5e6-225b-47a2-81df-d81b2d20fa8b up in Southbound
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.402 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:db:a8 10.100.0.10'], port_security=['fa:16:3e:85:db:a8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '99ec0256-cf67-4122-81b7-d0767c5a1347', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d6400b4e3f4d98a7456330f6429bd5', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'cba96dbc-c401-4d81-b355-4680d6ad5e15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d06b6f4b-ccde-4903-a1fe-e6bac9f52057, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=3c8fd5e6-225b-47a2-81df-d81b2d20fa8b) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.404 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 3c8fd5e6-225b-47a2-81df-d81b2d20fa8b in datapath 441198e3-04ff-48aa-b8a7-2339e4bb8085 bound to our chassis
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.406 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.430 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[bf85f725-2a29-4d04-b02e-4cc5aaacb77f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.470 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[d6bae3db-63ab-43bd-aed7-765e0607d7e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.475 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c284f15-8fce-4136-ba3f-f427c83a1f8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:53 compute-0 sshd-session[216023]: Connection closed by invalid user centos 117.175.160.58 port 37646 [preauth]
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.518 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[aafc3d87-58c8-4b96-843a-2a9adecbc2b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.541 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[7fdfd8e3-1d72-4b39-8218-7b8fc731eeb1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap441198e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:13:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 19, 'tx_packets': 9, 'rx_bytes': 1294, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 19, 'tx_packets': 9, 'rx_bytes': 1294, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348164, 'reachable_time': 44332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216131, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.563 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[d68fb925-b3ae-448c-aa87-04e77b81d9d9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348180, 'tstamp': 348180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216132, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348183, 'tstamp': 348183}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216132, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.565 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441198e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:53 compute-0 nova_compute[192567]: 2025-10-02 08:11:53.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:53 compute-0 nova_compute[192567]: 2025-10-02 08:11:53.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.570 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap441198e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.570 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.571 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap441198e3-00, col_values=(('external_ids', {'iface-id': 'f4e4745f-6cb7-4dfe-930a-ab5c5f2db11b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:53.572 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:11:53 compute-0 nova_compute[192567]: 2025-10-02 08:11:53.642 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-b4d496c6-fc60-476d-84fa-b8183df48147" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:11:53 compute-0 nova_compute[192567]: 2025-10-02 08:11:53.643 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-b4d496c6-fc60-476d-84fa-b8183df48147" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:11:53 compute-0 nova_compute[192567]: 2025-10-02 08:11:53.643 2 DEBUG nova.network.neutron [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:11:53 compute-0 nova_compute[192567]: 2025-10-02 08:11:53.650 2 INFO nova.compute.manager [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Post operation of migration started
Oct 02 08:11:53 compute-0 nova_compute[192567]: 2025-10-02 08:11:53.975 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-99ec0256-cf67-4122-81b7-d0767c5a1347" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:11:53 compute-0 nova_compute[192567]: 2025-10-02 08:11:53.976 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-99ec0256-cf67-4122-81b7-d0767c5a1347" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:11:53 compute-0 nova_compute[192567]: 2025-10-02 08:11:53.976 2 DEBUG nova.network.neutron [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.149 2 DEBUG nova.network.neutron [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Updating instance_info_cache with network_info: [{"id": "71aeead1-a439-4326-93bb-38c3281661f2", "address": "fa:16:3e:bb:9b:fd", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71aeead1-a4", "ovs_interfaceid": "71aeead1-a439-4326-93bb-38c3281661f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.168 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-b4d496c6-fc60-476d-84fa-b8183df48147" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.409 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.412 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.412 2 INFO nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Creating image(s)
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.414 2 DEBUG nova.objects.instance [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b4d496c6-fc60-476d-84fa-b8183df48147 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.427 2 DEBUG oslo_concurrency.processutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.454 2 DEBUG nova.compute.manager [req-9cf0f798-e6b1-4add-9911-bfe7c184969f req-5a2a2a2f-c459-4cfb-b4ae-8e4af04e668b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received event network-changed-71aeead1-a439-4326-93bb-38c3281661f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.455 2 DEBUG nova.compute.manager [req-9cf0f798-e6b1-4add-9911-bfe7c184969f req-5a2a2a2f-c459-4cfb-b4ae-8e4af04e668b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Refreshing instance network info cache due to event network-changed-71aeead1-a439-4326-93bb-38c3281661f2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.455 2 DEBUG oslo_concurrency.lockutils [req-9cf0f798-e6b1-4add-9911-bfe7c184969f req-5a2a2a2f-c459-4cfb-b4ae-8e4af04e668b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-b4d496c6-fc60-476d-84fa-b8183df48147" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.456 2 DEBUG oslo_concurrency.lockutils [req-9cf0f798-e6b1-4add-9911-bfe7c184969f req-5a2a2a2f-c459-4cfb-b4ae-8e4af04e668b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-b4d496c6-fc60-476d-84fa-b8183df48147" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.456 2 DEBUG nova.network.neutron [req-9cf0f798-e6b1-4add-9911-bfe7c184969f req-5a2a2a2f-c459-4cfb-b4ae-8e4af04e668b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Refreshing network info cache for port 71aeead1-a439-4326-93bb-38c3281661f2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.511 2 DEBUG oslo_concurrency.processutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.512 2 DEBUG nova.virt.disk.api [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.513 2 DEBUG oslo_concurrency.processutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.568 2 DEBUG oslo_concurrency.processutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.569 2 DEBUG nova.virt.disk.api [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.584 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.584 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Ensure instance console log exists: /var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.585 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.586 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.586 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.590 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Start _get_guest_xml network_info=[{"id": "71aeead1-a439-4326-93bb-38c3281661f2", "address": "fa:16:3e:bb:9b:fd", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "vif_mac": "fa:16:3e:bb:9b:fd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71aeead1-a4", "ovs_interfaceid": "71aeead1-a439-4326-93bb-38c3281661f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.595 2 WARNING nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.601 2 DEBUG nova.virt.libvirt.host [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.602 2 DEBUG nova.virt.libvirt.host [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.605 2 DEBUG nova.virt.libvirt.host [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.606 2 DEBUG nova.virt.libvirt.host [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.607 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.607 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.608 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.608 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.608 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.609 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.609 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.609 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.610 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.610 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.610 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.611 2 DEBUG nova.virt.hardware [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.611 2 DEBUG nova.objects.instance [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b4d496c6-fc60-476d-84fa-b8183df48147 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.628 2 DEBUG oslo_concurrency.processutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.725 2 DEBUG oslo_concurrency.processutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk.config --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.726 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "/var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.727 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "/var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.728 2 DEBUG oslo_concurrency.lockutils [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "/var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.729 2 DEBUG nova.virt.libvirt.vif [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:11:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1549390973',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1549390973',id=5,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:11:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-0i8hs3xf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:11:51Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=b4d496c6-fc60-476d-84fa-b8183df48147,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71aeead1-a439-4326-93bb-38c3281661f2", "address": "fa:16:3e:bb:9b:fd", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "vif_mac": "fa:16:3e:bb:9b:fd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71aeead1-a4", "ovs_interfaceid": "71aeead1-a439-4326-93bb-38c3281661f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.729 2 DEBUG nova.network.os_vif_util [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "71aeead1-a439-4326-93bb-38c3281661f2", "address": "fa:16:3e:bb:9b:fd", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "vif_mac": "fa:16:3e:bb:9b:fd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71aeead1-a4", "ovs_interfaceid": "71aeead1-a439-4326-93bb-38c3281661f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.730 2 DEBUG nova.network.os_vif_util [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=71aeead1-a439-4326-93bb-38c3281661f2,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71aeead1-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.733 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <uuid>b4d496c6-fc60-476d-84fa-b8183df48147</uuid>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <name>instance-00000005</name>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1549390973</nova:name>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:11:55</nova:creationTime>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:11:55 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:11:55 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:11:55 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:11:55 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:11:55 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:11:55 compute-0 nova_compute[192567]:         <nova:user uuid="4b5c71b386a34e829eef47bf613d813c">tempest-TestExecuteActionsViaActuator-547955480-project-admin</nova:user>
Oct 02 08:11:55 compute-0 nova_compute[192567]:         <nova:project uuid="a5d6400b4e3f4d98a7456330f6429bd5">tempest-TestExecuteActionsViaActuator-547955480</nova:project>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:11:55 compute-0 nova_compute[192567]:         <nova:port uuid="71aeead1-a439-4326-93bb-38c3281661f2">
Oct 02 08:11:55 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <system>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <entry name="serial">b4d496c6-fc60-476d-84fa-b8183df48147</entry>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <entry name="uuid">b4d496c6-fc60-476d-84fa-b8183df48147</entry>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     </system>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <os>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   </os>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <features>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   </features>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/disk.config"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:bb:9b:fd"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <target dev="tap71aeead1-a4"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147/console.log" append="off"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <video>
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     </video>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:11:55 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:11:55 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:11:55 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:11:55 compute-0 nova_compute[192567]: </domain>
Oct 02 08:11:55 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.735 2 DEBUG nova.virt.libvirt.vif [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:11:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1549390973',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1549390973',id=5,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:11:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-0i8hs3xf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:11:51Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=b4d496c6-fc60-476d-84fa-b8183df48147,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71aeead1-a439-4326-93bb-38c3281661f2", "address": "fa:16:3e:bb:9b:fd", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "vif_mac": "fa:16:3e:bb:9b:fd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71aeead1-a4", "ovs_interfaceid": "71aeead1-a439-4326-93bb-38c3281661f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.736 2 DEBUG nova.network.os_vif_util [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "71aeead1-a439-4326-93bb-38c3281661f2", "address": "fa:16:3e:bb:9b:fd", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "vif_mac": "fa:16:3e:bb:9b:fd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71aeead1-a4", "ovs_interfaceid": "71aeead1-a439-4326-93bb-38c3281661f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.736 2 DEBUG nova.network.os_vif_util [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=71aeead1-a439-4326-93bb-38c3281661f2,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71aeead1-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.737 2 DEBUG os_vif [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=71aeead1-a439-4326-93bb-38c3281661f2,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71aeead1-a4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.738 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.739 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.742 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71aeead1-a4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.743 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap71aeead1-a4, col_values=(('external_ids', {'iface-id': '71aeead1-a439-4326-93bb-38c3281661f2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:9b:fd', 'vm-uuid': 'b4d496c6-fc60-476d-84fa-b8183df48147'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:55 compute-0 NetworkManager[51654]: <info>  [1759392715.7464] manager: (tap71aeead1-a4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.755 2 INFO os_vif [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=71aeead1-a439-4326-93bb-38c3281661f2,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71aeead1-a4')
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.820 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.821 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.821 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No VIF found with MAC fa:16:3e:bb:9b:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.822 2 INFO nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Using config drive
Oct 02 08:11:55 compute-0 kernel: tap71aeead1-a4: entered promiscuous mode
Oct 02 08:11:55 compute-0 NetworkManager[51654]: <info>  [1759392715.8971] manager: (tap71aeead1-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Oct 02 08:11:55 compute-0 ovn_controller[94821]: 2025-10-02T08:11:55Z|00047|binding|INFO|Claiming lport 71aeead1-a439-4326-93bb-38c3281661f2 for this chassis.
Oct 02 08:11:55 compute-0 ovn_controller[94821]: 2025-10-02T08:11:55Z|00048|binding|INFO|71aeead1-a439-4326-93bb-38c3281661f2: Claiming fa:16:3e:bb:9b:fd 10.100.0.13
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:55.908 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:9b:fd 10.100.0.13'], port_security=['fa:16:3e:bb:9b:fd 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'b4d496c6-fc60-476d-84fa-b8183df48147', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d6400b4e3f4d98a7456330f6429bd5', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'cba96dbc-c401-4d81-b355-4680d6ad5e15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d06b6f4b-ccde-4903-a1fe-e6bac9f52057, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=71aeead1-a439-4326-93bb-38c3281661f2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:11:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:55.910 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 71aeead1-a439-4326-93bb-38c3281661f2 in datapath 441198e3-04ff-48aa-b8a7-2339e4bb8085 bound to our chassis
Oct 02 08:11:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:55.911 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:11:55 compute-0 ovn_controller[94821]: 2025-10-02T08:11:55Z|00049|binding|INFO|Setting lport 71aeead1-a439-4326-93bb-38c3281661f2 ovn-installed in OVS
Oct 02 08:11:55 compute-0 ovn_controller[94821]: 2025-10-02T08:11:55Z|00050|binding|INFO|Setting lport 71aeead1-a439-4326-93bb-38c3281661f2 up in Southbound
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:55 compute-0 nova_compute[192567]: 2025-10-02 08:11:55.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:55.942 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a736058a-b0e3-4802-9a0f-e751f74212e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:55 compute-0 systemd-machined[152597]: New machine qemu-5-instance-00000005.
Oct 02 08:11:55 compute-0 systemd-udevd[216161]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:11:55 compute-0 NetworkManager[51654]: <info>  [1759392715.9770] device (tap71aeead1-a4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:11:55 compute-0 NetworkManager[51654]: <info>  [1759392715.9790] device (tap71aeead1-a4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:11:55 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Oct 02 08:11:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:55.988 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[0676ab71-62f9-41a4-92f2-5410ab73e3fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:55.992 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c279d31-3d4c-438b-aa85-a0b09d450f78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:56.034 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[153f6d31-d1f5-489b-96a4-b4d9516d0a5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.040 2 DEBUG nova.network.neutron [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Updating instance_info_cache with network_info: [{"id": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "address": "fa:16:3e:85:db:a8", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c8fd5e6-22", "ovs_interfaceid": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:11:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:56.063 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[643fe39c-b522-453a-a786-2d3812ee7a27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap441198e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:13:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 19, 'tx_packets': 11, 'rx_bytes': 1294, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 19, 'tx_packets': 11, 'rx_bytes': 1294, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348164, 'reachable_time': 44332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216170, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.070 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-99ec0256-cf67-4122-81b7-d0767c5a1347" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:11:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:56.089 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[eb894582-9393-4929-a202-acf40027291c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348180, 'tstamp': 348180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216173, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348183, 'tstamp': 348183}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216173, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:11:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:56.091 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441198e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:11:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:56.096 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap441198e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:56.097 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:11:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:56.097 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap441198e3-00, col_values=(('external_ids', {'iface-id': 'f4e4745f-6cb7-4dfe-930a-ab5c5f2db11b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:11:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:11:56.098 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.105 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.106 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.106 2 DEBUG oslo_concurrency.lockutils [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.112 2 INFO nova.virt.libvirt.driver [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:11:56 compute-0 virtqemud[192112]: Domain id=4 name='instance-00000003' uuid=99ec0256-cf67-4122-81b7-d0767c5a1347 is tainted: custom-monitor
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.200 2 DEBUG nova.compute.manager [req-3069bfeb-ad6c-4702-8b34-9a52bfa4b228 req-e9f63d7f-2e3c-49d9-bb6f-405219e65d63 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received event network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.201 2 DEBUG oslo_concurrency.lockutils [req-3069bfeb-ad6c-4702-8b34-9a52bfa4b228 req-e9f63d7f-2e3c-49d9-bb6f-405219e65d63 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.202 2 DEBUG oslo_concurrency.lockutils [req-3069bfeb-ad6c-4702-8b34-9a52bfa4b228 req-e9f63d7f-2e3c-49d9-bb6f-405219e65d63 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.202 2 DEBUG oslo_concurrency.lockutils [req-3069bfeb-ad6c-4702-8b34-9a52bfa4b228 req-e9f63d7f-2e3c-49d9-bb6f-405219e65d63 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.202 2 DEBUG nova.compute.manager [req-3069bfeb-ad6c-4702-8b34-9a52bfa4b228 req-e9f63d7f-2e3c-49d9-bb6f-405219e65d63 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] No waiting events found dispatching network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.203 2 WARNING nova.compute.manager [req-3069bfeb-ad6c-4702-8b34-9a52bfa4b228 req-e9f63d7f-2e3c-49d9-bb6f-405219e65d63 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received unexpected event network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 for instance with vm_state active and task_state resize_finish.
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.931 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392716.9310992, b4d496c6-fc60-476d-84fa-b8183df48147 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.933 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] VM Resumed (Lifecycle Event)
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.936 2 DEBUG nova.compute.manager [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.944 2 INFO nova.virt.libvirt.driver [-] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Instance running successfully.
Oct 02 08:11:56 compute-0 virtqemud[192112]: argument unsupported: QEMU guest agent is not configured
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.947 2 DEBUG nova.virt.libvirt.guest [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.948 2 DEBUG nova.virt.libvirt.driver [None req-8218146f-98d2-436d-aa16-88a8fce246ac f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.957 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:11:56 compute-0 nova_compute[192567]: 2025-10-02 08:11:56.962 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:11:57 compute-0 nova_compute[192567]: 2025-10-02 08:11:57.008 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] During sync_power_state the instance has a pending task (resize_finish). Skip.
Oct 02 08:11:57 compute-0 nova_compute[192567]: 2025-10-02 08:11:57.009 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392716.9314466, b4d496c6-fc60-476d-84fa-b8183df48147 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:11:57 compute-0 nova_compute[192567]: 2025-10-02 08:11:57.010 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] VM Started (Lifecycle Event)
Oct 02 08:11:57 compute-0 nova_compute[192567]: 2025-10-02 08:11:57.030 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:11:57 compute-0 nova_compute[192567]: 2025-10-02 08:11:57.034 2 DEBUG nova.network.neutron [req-9cf0f798-e6b1-4add-9911-bfe7c184969f req-5a2a2a2f-c459-4cfb-b4ae-8e4af04e668b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Updated VIF entry in instance network info cache for port 71aeead1-a439-4326-93bb-38c3281661f2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:11:57 compute-0 nova_compute[192567]: 2025-10-02 08:11:57.035 2 DEBUG nova.network.neutron [req-9cf0f798-e6b1-4add-9911-bfe7c184969f req-5a2a2a2f-c459-4cfb-b4ae-8e4af04e668b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Updating instance_info_cache with network_info: [{"id": "71aeead1-a439-4326-93bb-38c3281661f2", "address": "fa:16:3e:bb:9b:fd", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71aeead1-a4", "ovs_interfaceid": "71aeead1-a439-4326-93bb-38c3281661f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:11:57 compute-0 nova_compute[192567]: 2025-10-02 08:11:57.057 2 DEBUG oslo_concurrency.lockutils [req-9cf0f798-e6b1-4add-9911-bfe7c184969f req-5a2a2a2f-c459-4cfb-b4ae-8e4af04e668b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-b4d496c6-fc60-476d-84fa-b8183df48147" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:11:57 compute-0 nova_compute[192567]: 2025-10-02 08:11:57.059 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:11:57 compute-0 nova_compute[192567]: 2025-10-02 08:11:57.122 2 INFO nova.virt.libvirt.driver [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:11:58 compute-0 nova_compute[192567]: 2025-10-02 08:11:58.129 2 INFO nova.virt.libvirt.driver [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:11:58 compute-0 nova_compute[192567]: 2025-10-02 08:11:58.137 2 DEBUG nova.compute.manager [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:11:58 compute-0 nova_compute[192567]: 2025-10-02 08:11:58.163 2 DEBUG nova.objects.instance [None req-2ad7ce08-337d-4972-94aa-bc0968c50ffb f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:11:58 compute-0 nova_compute[192567]: 2025-10-02 08:11:58.302 2 DEBUG nova.compute.manager [req-1f45eae9-2694-4603-a8fb-c58ff0eb904e req-33921bde-e9a7-423d-9a1c-0a926893ffcb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received event network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:11:58 compute-0 nova_compute[192567]: 2025-10-02 08:11:58.302 2 DEBUG oslo_concurrency.lockutils [req-1f45eae9-2694-4603-a8fb-c58ff0eb904e req-33921bde-e9a7-423d-9a1c-0a926893ffcb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:58 compute-0 nova_compute[192567]: 2025-10-02 08:11:58.303 2 DEBUG oslo_concurrency.lockutils [req-1f45eae9-2694-4603-a8fb-c58ff0eb904e req-33921bde-e9a7-423d-9a1c-0a926893ffcb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:58 compute-0 nova_compute[192567]: 2025-10-02 08:11:58.304 2 DEBUG oslo_concurrency.lockutils [req-1f45eae9-2694-4603-a8fb-c58ff0eb904e req-33921bde-e9a7-423d-9a1c-0a926893ffcb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:11:58 compute-0 nova_compute[192567]: 2025-10-02 08:11:58.304 2 DEBUG nova.compute.manager [req-1f45eae9-2694-4603-a8fb-c58ff0eb904e req-33921bde-e9a7-423d-9a1c-0a926893ffcb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] No waiting events found dispatching network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:11:58 compute-0 nova_compute[192567]: 2025-10-02 08:11:58.305 2 WARNING nova.compute.manager [req-1f45eae9-2694-4603-a8fb-c58ff0eb904e req-33921bde-e9a7-423d-9a1c-0a926893ffcb 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received unexpected event network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 for instance with vm_state resized and task_state None.
Oct 02 08:11:58 compute-0 unix_chkpwd[216185]: password check failed for user (root)
Oct 02 08:11:58 compute-0 sshd-session[216183]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 02 08:11:59 compute-0 podman[203011]: time="2025-10-02T08:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:11:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:11:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3459 "" "Go-http-client/1.1"
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.923 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.953 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Triggering sync for uuid 2e661e5f-2462-4ffd-99a7-afc83d45f425 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.954 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Triggering sync for uuid 99ec0256-cf67-4122-81b7-d0767c5a1347 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.954 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Triggering sync for uuid f13a8d11-bf67-4548-81bb-3bfd210a0471 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.954 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Triggering sync for uuid b4d496c6-fc60-476d-84fa-b8183df48147 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.955 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Triggering sync for uuid ed125c83-0f73-41e4-925c-db2354932843 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.955 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "2e661e5f-2462-4ffd-99a7-afc83d45f425" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.955 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.956 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "99ec0256-cf67-4122-81b7-d0767c5a1347" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.956 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "99ec0256-cf67-4122-81b7-d0767c5a1347" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.956 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "f13a8d11-bf67-4548-81bb-3bfd210a0471" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.957 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.957 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "b4d496c6-fc60-476d-84fa-b8183df48147" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.957 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "b4d496c6-fc60-476d-84fa-b8183df48147" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.958 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "ed125c83-0f73-41e4-925c-db2354932843" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:11:59 compute-0 nova_compute[192567]: 2025-10-02 08:11:59.958 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "ed125c83-0f73-41e4-925c-db2354932843" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:00 compute-0 nova_compute[192567]: 2025-10-02 08:12:00.005 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "99ec0256-cf67-4122-81b7-d0767c5a1347" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:00 compute-0 nova_compute[192567]: 2025-10-02 08:12:00.007 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:00 compute-0 nova_compute[192567]: 2025-10-02 08:12:00.013 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "b4d496c6-fc60-476d-84fa-b8183df48147" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:00 compute-0 nova_compute[192567]: 2025-10-02 08:12:00.022 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "ed125c83-0f73-41e4-925c-db2354932843" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:00 compute-0 nova_compute[192567]: 2025-10-02 08:12:00.036 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:00 compute-0 nova_compute[192567]: 2025-10-02 08:12:00.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:00 compute-0 sshd-session[216183]: Failed password for root from 91.224.92.32 port 50908 ssh2
Oct 02 08:12:00 compute-0 nova_compute[192567]: 2025-10-02 08:12:00.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:00 compute-0 unix_chkpwd[216187]: password check failed for user (root)
Oct 02 08:12:01 compute-0 openstack_network_exporter[205118]: ERROR   08:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:12:01 compute-0 openstack_network_exporter[205118]: ERROR   08:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:12:01 compute-0 openstack_network_exporter[205118]: ERROR   08:12:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:12:01 compute-0 openstack_network_exporter[205118]: ERROR   08:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:12:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:12:01 compute-0 openstack_network_exporter[205118]: ERROR   08:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:12:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:12:01 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Oct 02 08:12:01 compute-0 systemd[215996]: Activating special unit Exit the Session...
Oct 02 08:12:01 compute-0 systemd[215996]: Stopped target Main User Target.
Oct 02 08:12:01 compute-0 systemd[215996]: Stopped target Basic System.
Oct 02 08:12:01 compute-0 systemd[215996]: Stopped target Paths.
Oct 02 08:12:01 compute-0 systemd[215996]: Stopped target Sockets.
Oct 02 08:12:01 compute-0 systemd[215996]: Stopped target Timers.
Oct 02 08:12:01 compute-0 systemd[215996]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 02 08:12:01 compute-0 systemd[215996]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 02 08:12:01 compute-0 systemd[215996]: Closed D-Bus User Message Bus Socket.
Oct 02 08:12:01 compute-0 systemd[215996]: Stopped Create User's Volatile Files and Directories.
Oct 02 08:12:01 compute-0 systemd[215996]: Removed slice User Application Slice.
Oct 02 08:12:01 compute-0 systemd[215996]: Reached target Shutdown.
Oct 02 08:12:01 compute-0 systemd[215996]: Finished Exit the Session.
Oct 02 08:12:01 compute-0 systemd[215996]: Reached target Exit the Session.
Oct 02 08:12:01 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Oct 02 08:12:01 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Oct 02 08:12:01 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct 02 08:12:01 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct 02 08:12:01 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct 02 08:12:01 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct 02 08:12:01 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Oct 02 08:12:01 compute-0 podman[216188]: 2025-10-02 08:12:01.719899151 +0000 UTC m=+0.121961896 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, architecture=x86_64)
Oct 02 08:12:03 compute-0 sshd-session[216183]: Failed password for root from 91.224.92.32 port 50908 ssh2
Oct 02 08:12:04 compute-0 unix_chkpwd[216224]: password check failed for user (root)
Oct 02 08:12:05 compute-0 nova_compute[192567]: 2025-10-02 08:12:05.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:05 compute-0 nova_compute[192567]: 2025-10-02 08:12:05.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:06 compute-0 sshd-session[216183]: Failed password for root from 91.224.92.32 port 50908 ssh2
Oct 02 08:12:07 compute-0 sshd-session[216183]: Received disconnect from 91.224.92.32 port 50908:11:  [preauth]
Oct 02 08:12:07 compute-0 sshd-session[216183]: Disconnected from authenticating user root 91.224.92.32 port 50908 [preauth]
Oct 02 08:12:07 compute-0 sshd-session[216183]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 02 08:12:07 compute-0 unix_chkpwd[216240]: password check failed for user (root)
Oct 02 08:12:07 compute-0 sshd-session[216238]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 02 08:12:08 compute-0 ovn_controller[94821]: 2025-10-02T08:12:08Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bb:9b:fd 10.100.0.13
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.217 2 DEBUG oslo_concurrency.lockutils [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "ed125c83-0f73-41e4-925c-db2354932843" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.218 2 DEBUG oslo_concurrency.lockutils [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.218 2 DEBUG oslo_concurrency.lockutils [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "ed125c83-0f73-41e4-925c-db2354932843-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.218 2 DEBUG oslo_concurrency.lockutils [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.218 2 DEBUG oslo_concurrency.lockutils [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.220 2 INFO nova.compute.manager [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Terminating instance
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.220 2 DEBUG nova.compute.manager [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:12:09 compute-0 kernel: tap11fd9ac4-a7 (unregistering): left promiscuous mode
Oct 02 08:12:09 compute-0 NetworkManager[51654]: <info>  [1759392729.2487] device (tap11fd9ac4-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:12:09 compute-0 ovn_controller[94821]: 2025-10-02T08:12:09Z|00051|binding|INFO|Releasing lport 11fd9ac4-a789-4053-a2ed-1bf04b861368 from this chassis (sb_readonly=0)
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:09 compute-0 ovn_controller[94821]: 2025-10-02T08:12:09Z|00052|binding|INFO|Setting lport 11fd9ac4-a789-4053-a2ed-1bf04b861368 down in Southbound
Oct 02 08:12:09 compute-0 ovn_controller[94821]: 2025-10-02T08:12:09Z|00053|binding|INFO|Removing iface tap11fd9ac4-a7 ovn-installed in OVS
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.267 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:70:8c 10.100.0.12'], port_security=['fa:16:3e:52:70:8c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ed125c83-0f73-41e4-925c-db2354932843', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d6400b4e3f4d98a7456330f6429bd5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cba96dbc-c401-4d81-b355-4680d6ad5e15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d06b6f4b-ccde-4903-a1fe-e6bac9f52057, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=11fd9ac4-a789-4053-a2ed-1bf04b861368) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.269 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 11fd9ac4-a789-4053-a2ed-1bf04b861368 in datapath 441198e3-04ff-48aa-b8a7-2339e4bb8085 unbound from our chassis
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.271 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.291 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[c36340ce-802a-4fe1-aa9f-24dee5e3139a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:09 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Oct 02 08:12:09 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 14.268s CPU time.
Oct 02 08:12:09 compute-0 systemd-machined[152597]: Machine qemu-3-instance-00000006 terminated.
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.337 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[0990737d-9b98-42fe-a951-311b334381ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.340 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[bac93a74-f6aa-4dfc-bde0-aa1034af2318]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.372 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[04ca0e38-220f-49c1-9eef-7d5f874b08cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.391 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[c10f1d3c-6272-4763-af2c-7fdd3c7df8dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap441198e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:13:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 34, 'tx_packets': 13, 'rx_bytes': 1924, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 34, 'tx_packets': 13, 'rx_bytes': 1924, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348164, 'reachable_time': 44332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216252, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.407 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e47e3e-bcdb-464b-89d6-7bc6bfe021eb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348180, 'tstamp': 348180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216253, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348183, 'tstamp': 348183}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216253, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.409 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441198e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.417 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap441198e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.417 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.418 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap441198e3-00, col_values=(('external_ids', {'iface-id': 'f4e4745f-6cb7-4dfe-930a-ab5c5f2db11b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:09.419 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.490 2 INFO nova.virt.libvirt.driver [-] [instance: ed125c83-0f73-41e4-925c-db2354932843] Instance destroyed successfully.
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.491 2 DEBUG nova.objects.instance [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'resources' on Instance uuid ed125c83-0f73-41e4-925c-db2354932843 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.512 2 DEBUG nova.virt.libvirt.vif [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:11:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1025051940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1025051940',id=6,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:11:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-8hhpdkh2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:11:28Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=ed125c83-0f73-41e4-925c-db2354932843,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "address": "fa:16:3e:52:70:8c", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11fd9ac4-a7", "ovs_interfaceid": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.513 2 DEBUG nova.network.os_vif_util [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "address": "fa:16:3e:52:70:8c", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11fd9ac4-a7", "ovs_interfaceid": "11fd9ac4-a789-4053-a2ed-1bf04b861368", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.514 2 DEBUG nova.network.os_vif_util [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:8c,bridge_name='br-int',has_traffic_filtering=True,id=11fd9ac4-a789-4053-a2ed-1bf04b861368,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11fd9ac4-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.514 2 DEBUG os_vif [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:8c,bridge_name='br-int',has_traffic_filtering=True,id=11fd9ac4-a789-4053-a2ed-1bf04b861368,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11fd9ac4-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.519 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11fd9ac4-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.528 2 INFO os_vif [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:70:8c,bridge_name='br-int',has_traffic_filtering=True,id=11fd9ac4-a789-4053-a2ed-1bf04b861368,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11fd9ac4-a7')
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.530 2 INFO nova.virt.libvirt.driver [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Deleting instance files /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843_del
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.531 2 INFO nova.virt.libvirt.driver [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Deletion of /var/lib/nova/instances/ed125c83-0f73-41e4-925c-db2354932843_del complete
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.589 2 DEBUG nova.virt.libvirt.host [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.590 2 INFO nova.virt.libvirt.host [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] UEFI support detected
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.594 2 INFO nova.compute.manager [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Took 0.37 seconds to destroy the instance on the hypervisor.
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.594 2 DEBUG oslo.service.loopingcall [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.595 2 DEBUG nova.compute.manager [-] [instance: ed125c83-0f73-41e4-925c-db2354932843] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.595 2 DEBUG nova.network.neutron [-] [instance: ed125c83-0f73-41e4-925c-db2354932843] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.933 2 DEBUG nova.compute.manager [req-b705d95f-55af-4567-8774-4c44df40ae3b req-08cfbaf0-2490-412c-8546-ae594b42becd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Received event network-vif-unplugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.934 2 DEBUG oslo_concurrency.lockutils [req-b705d95f-55af-4567-8774-4c44df40ae3b req-08cfbaf0-2490-412c-8546-ae594b42becd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "ed125c83-0f73-41e4-925c-db2354932843-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.934 2 DEBUG oslo_concurrency.lockutils [req-b705d95f-55af-4567-8774-4c44df40ae3b req-08cfbaf0-2490-412c-8546-ae594b42becd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.935 2 DEBUG oslo_concurrency.lockutils [req-b705d95f-55af-4567-8774-4c44df40ae3b req-08cfbaf0-2490-412c-8546-ae594b42becd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.935 2 DEBUG nova.compute.manager [req-b705d95f-55af-4567-8774-4c44df40ae3b req-08cfbaf0-2490-412c-8546-ae594b42becd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] No waiting events found dispatching network-vif-unplugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:12:09 compute-0 nova_compute[192567]: 2025-10-02 08:12:09.936 2 DEBUG nova.compute.manager [req-b705d95f-55af-4567-8774-4c44df40ae3b req-08cfbaf0-2490-412c-8546-ae594b42becd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Received event network-vif-unplugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:12:10 compute-0 nova_compute[192567]: 2025-10-02 08:12:10.250 2 DEBUG nova.network.neutron [-] [instance: ed125c83-0f73-41e4-925c-db2354932843] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:12:10 compute-0 nova_compute[192567]: 2025-10-02 08:12:10.277 2 INFO nova.compute.manager [-] [instance: ed125c83-0f73-41e4-925c-db2354932843] Took 0.68 seconds to deallocate network for instance.
Oct 02 08:12:10 compute-0 nova_compute[192567]: 2025-10-02 08:12:10.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:10 compute-0 nova_compute[192567]: 2025-10-02 08:12:10.344 2 DEBUG oslo_concurrency.lockutils [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:10 compute-0 nova_compute[192567]: 2025-10-02 08:12:10.346 2 DEBUG oslo_concurrency.lockutils [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:10 compute-0 nova_compute[192567]: 2025-10-02 08:12:10.500 2 DEBUG nova.compute.provider_tree [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:12:10 compute-0 nova_compute[192567]: 2025-10-02 08:12:10.522 2 DEBUG nova.scheduler.client.report [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:12:10 compute-0 nova_compute[192567]: 2025-10-02 08:12:10.554 2 DEBUG oslo_concurrency.lockutils [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:10 compute-0 nova_compute[192567]: 2025-10-02 08:12:10.582 2 INFO nova.scheduler.client.report [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Deleted allocations for instance ed125c83-0f73-41e4-925c-db2354932843
Oct 02 08:12:10 compute-0 sshd-session[216238]: Failed password for root from 91.224.92.32 port 49122 ssh2
Oct 02 08:12:10 compute-0 nova_compute[192567]: 2025-10-02 08:12:10.684 2 DEBUG oslo_concurrency.lockutils [None req-66aa4ce9-8809-4cb0-a026-aa335de6a958 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:11 compute-0 podman[216272]: 2025-10-02 08:12:11.193025638 +0000 UTC m=+0.091140082 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 02 08:12:11 compute-0 podman[216274]: 2025-10-02 08:12:11.216217766 +0000 UTC m=+0.106500348 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 02 08:12:11 compute-0 podman[216273]: 2025-10-02 08:12:11.258759303 +0000 UTC m=+0.155242926 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.304 2 DEBUG oslo_concurrency.lockutils [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "b4d496c6-fc60-476d-84fa-b8183df48147" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.304 2 DEBUG oslo_concurrency.lockutils [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.305 2 DEBUG oslo_concurrency.lockutils [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.305 2 DEBUG oslo_concurrency.lockutils [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.305 2 DEBUG oslo_concurrency.lockutils [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.306 2 INFO nova.compute.manager [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Terminating instance
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.307 2 DEBUG nova.compute.manager [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:12:11 compute-0 kernel: tap71aeead1-a4 (unregistering): left promiscuous mode
Oct 02 08:12:11 compute-0 NetworkManager[51654]: <info>  [1759392731.3342] device (tap71aeead1-a4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:12:11 compute-0 ovn_controller[94821]: 2025-10-02T08:12:11Z|00054|binding|INFO|Releasing lport 71aeead1-a439-4326-93bb-38c3281661f2 from this chassis (sb_readonly=0)
Oct 02 08:12:11 compute-0 ovn_controller[94821]: 2025-10-02T08:12:11Z|00055|binding|INFO|Setting lport 71aeead1-a439-4326-93bb-38c3281661f2 down in Southbound
Oct 02 08:12:11 compute-0 ovn_controller[94821]: 2025-10-02T08:12:11Z|00056|binding|INFO|Removing iface tap71aeead1-a4 ovn-installed in OVS
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.361 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:9b:fd 10.100.0.13'], port_security=['fa:16:3e:bb:9b:fd 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'b4d496c6-fc60-476d-84fa-b8183df48147', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d6400b4e3f4d98a7456330f6429bd5', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'cba96dbc-c401-4d81-b355-4680d6ad5e15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d06b6f4b-ccde-4903-a1fe-e6bac9f52057, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=71aeead1-a439-4326-93bb-38c3281661f2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.362 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 71aeead1-a439-4326-93bb-38c3281661f2 in datapath 441198e3-04ff-48aa-b8a7-2339e4bb8085 unbound from our chassis
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.364 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.381 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[13c85996-2c96-4240-a013-dc928258269c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:11 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Oct 02 08:12:11 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 12.165s CPU time.
Oct 02 08:12:11 compute-0 systemd-machined[152597]: Machine qemu-5-instance-00000005 terminated.
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.421 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[f2065b09-ea26-4d66-a69c-9b0daabf745f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.424 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[2894c981-3927-4bd1-8ece-07650741eefd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.469 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[34638118-4fe1-4c8c-a615-786863e76102]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.498 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a0f3a884-59b4-48f4-b457-cd971630a64e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap441198e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:13:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 34, 'tx_packets': 15, 'rx_bytes': 1924, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 34, 'tx_packets': 15, 'rx_bytes': 1924, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348164, 'reachable_time': 44332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216345, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.524 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[9140294d-e909-4716-b230-d5c662adcafa]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348180, 'tstamp': 348180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216346, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348183, 'tstamp': 348183}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216346, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.526 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441198e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.535 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap441198e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.535 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.537 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap441198e3-00, col_values=(('external_ids', {'iface-id': 'f4e4745f-6cb7-4dfe-930a-ab5c5f2db11b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:11 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:11.538 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.590 2 INFO nova.virt.libvirt.driver [-] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Instance destroyed successfully.
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.591 2 DEBUG nova.objects.instance [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'resources' on Instance uuid b4d496c6-fc60-476d-84fa-b8183df48147 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.604 2 DEBUG nova.virt.libvirt.vif [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:11:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1549390973',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1549390973',id=5,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:11:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-0i8hs3xf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:12:04Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=b4d496c6-fc60-476d-84fa-b8183df48147,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71aeead1-a439-4326-93bb-38c3281661f2", "address": "fa:16:3e:bb:9b:fd", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71aeead1-a4", "ovs_interfaceid": "71aeead1-a439-4326-93bb-38c3281661f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.605 2 DEBUG nova.network.os_vif_util [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "71aeead1-a439-4326-93bb-38c3281661f2", "address": "fa:16:3e:bb:9b:fd", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71aeead1-a4", "ovs_interfaceid": "71aeead1-a439-4326-93bb-38c3281661f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.606 2 DEBUG nova.network.os_vif_util [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bb:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=71aeead1-a439-4326-93bb-38c3281661f2,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71aeead1-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.607 2 DEBUG os_vif [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=71aeead1-a439-4326-93bb-38c3281661f2,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71aeead1-a4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.609 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71aeead1-a4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.617 2 INFO os_vif [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:9b:fd,bridge_name='br-int',has_traffic_filtering=True,id=71aeead1-a439-4326-93bb-38c3281661f2,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71aeead1-a4')
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.619 2 INFO nova.virt.libvirt.driver [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Deleting instance files /var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147_del
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.631 2 INFO nova.virt.libvirt.driver [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Deletion of /var/lib/nova/instances/b4d496c6-fc60-476d-84fa-b8183df48147_del complete
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.693 2 INFO nova.compute.manager [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Took 0.39 seconds to destroy the instance on the hypervisor.
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.694 2 DEBUG oslo.service.loopingcall [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.694 2 DEBUG nova.compute.manager [-] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:12:11 compute-0 nova_compute[192567]: 2025-10-02 08:12:11.695 2 DEBUG nova.network.neutron [-] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:12:12 compute-0 unix_chkpwd[216364]: password check failed for user (root)
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.212 2 DEBUG nova.compute.manager [req-dec701e1-39e7-40ca-9f4c-5a70ad7fd64b req-1a06d507-461a-4b77-b91a-76cdcebbbe98 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received event network-vif-unplugged-71aeead1-a439-4326-93bb-38c3281661f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.214 2 DEBUG oslo_concurrency.lockutils [req-dec701e1-39e7-40ca-9f4c-5a70ad7fd64b req-1a06d507-461a-4b77-b91a-76cdcebbbe98 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.214 2 DEBUG oslo_concurrency.lockutils [req-dec701e1-39e7-40ca-9f4c-5a70ad7fd64b req-1a06d507-461a-4b77-b91a-76cdcebbbe98 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.214 2 DEBUG oslo_concurrency.lockutils [req-dec701e1-39e7-40ca-9f4c-5a70ad7fd64b req-1a06d507-461a-4b77-b91a-76cdcebbbe98 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.214 2 DEBUG nova.compute.manager [req-dec701e1-39e7-40ca-9f4c-5a70ad7fd64b req-1a06d507-461a-4b77-b91a-76cdcebbbe98 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] No waiting events found dispatching network-vif-unplugged-71aeead1-a439-4326-93bb-38c3281661f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.215 2 DEBUG nova.compute.manager [req-dec701e1-39e7-40ca-9f4c-5a70ad7fd64b req-1a06d507-461a-4b77-b91a-76cdcebbbe98 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received event network-vif-unplugged-71aeead1-a439-4326-93bb-38c3281661f2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.246 2 DEBUG nova.network.neutron [-] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.258 2 DEBUG nova.compute.manager [req-68ec9b42-3288-4e5d-914f-4a067d2b3bf8 req-fee88131-a674-4069-86e6-3c7e5a591db9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Received event network-vif-plugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.258 2 DEBUG oslo_concurrency.lockutils [req-68ec9b42-3288-4e5d-914f-4a067d2b3bf8 req-fee88131-a674-4069-86e6-3c7e5a591db9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "ed125c83-0f73-41e4-925c-db2354932843-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.259 2 DEBUG oslo_concurrency.lockutils [req-68ec9b42-3288-4e5d-914f-4a067d2b3bf8 req-fee88131-a674-4069-86e6-3c7e5a591db9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.259 2 DEBUG oslo_concurrency.lockutils [req-68ec9b42-3288-4e5d-914f-4a067d2b3bf8 req-fee88131-a674-4069-86e6-3c7e5a591db9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ed125c83-0f73-41e4-925c-db2354932843-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.259 2 DEBUG nova.compute.manager [req-68ec9b42-3288-4e5d-914f-4a067d2b3bf8 req-fee88131-a674-4069-86e6-3c7e5a591db9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] No waiting events found dispatching network-vif-plugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.259 2 WARNING nova.compute.manager [req-68ec9b42-3288-4e5d-914f-4a067d2b3bf8 req-fee88131-a674-4069-86e6-3c7e5a591db9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Received unexpected event network-vif-plugged-11fd9ac4-a789-4053-a2ed-1bf04b861368 for instance with vm_state deleted and task_state None.
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.260 2 DEBUG nova.compute.manager [req-68ec9b42-3288-4e5d-914f-4a067d2b3bf8 req-fee88131-a674-4069-86e6-3c7e5a591db9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ed125c83-0f73-41e4-925c-db2354932843] Received event network-vif-deleted-11fd9ac4-a789-4053-a2ed-1bf04b861368 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.264 2 INFO nova.compute.manager [-] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Took 0.57 seconds to deallocate network for instance.
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.332 2 DEBUG oslo_concurrency.lockutils [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.332 2 DEBUG oslo_concurrency.lockutils [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.338 2 DEBUG oslo_concurrency.lockutils [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.359 2 INFO nova.scheduler.client.report [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Deleted allocations for instance b4d496c6-fc60-476d-84fa-b8183df48147
Oct 02 08:12:12 compute-0 nova_compute[192567]: 2025-10-02 08:12:12.420 2 DEBUG oslo_concurrency.lockutils [None req-7b33db5e-606d-4a0e-847e-8bd84424a8b3 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:13 compute-0 podman[216365]: 2025-10-02 08:12:13.212079903 +0000 UTC m=+0.106990064 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:12:13 compute-0 nova_compute[192567]: 2025-10-02 08:12:13.788 2 DEBUG oslo_concurrency.lockutils [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "f13a8d11-bf67-4548-81bb-3bfd210a0471" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:13 compute-0 nova_compute[192567]: 2025-10-02 08:12:13.788 2 DEBUG oslo_concurrency.lockutils [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:13 compute-0 nova_compute[192567]: 2025-10-02 08:12:13.789 2 DEBUG oslo_concurrency.lockutils [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:13 compute-0 nova_compute[192567]: 2025-10-02 08:12:13.789 2 DEBUG oslo_concurrency.lockutils [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:13 compute-0 nova_compute[192567]: 2025-10-02 08:12:13.789 2 DEBUG oslo_concurrency.lockutils [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:13 compute-0 nova_compute[192567]: 2025-10-02 08:12:13.791 2 INFO nova.compute.manager [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Terminating instance
Oct 02 08:12:13 compute-0 nova_compute[192567]: 2025-10-02 08:12:13.793 2 DEBUG nova.compute.manager [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:12:13 compute-0 kernel: tapea9a363a-d8 (unregistering): left promiscuous mode
Oct 02 08:12:13 compute-0 NetworkManager[51654]: <info>  [1759392733.8239] device (tapea9a363a-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:12:13 compute-0 nova_compute[192567]: 2025-10-02 08:12:13.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:13 compute-0 ovn_controller[94821]: 2025-10-02T08:12:13Z|00057|binding|INFO|Releasing lport ea9a363a-d800-41d1-b7a3-819f91395719 from this chassis (sb_readonly=0)
Oct 02 08:12:13 compute-0 ovn_controller[94821]: 2025-10-02T08:12:13Z|00058|binding|INFO|Setting lport ea9a363a-d800-41d1-b7a3-819f91395719 down in Southbound
Oct 02 08:12:13 compute-0 ovn_controller[94821]: 2025-10-02T08:12:13Z|00059|binding|INFO|Removing iface tapea9a363a-d8 ovn-installed in OVS
Oct 02 08:12:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:13.886 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:27:70 10.100.0.9'], port_security=['fa:16:3e:7c:27:70 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f13a8d11-bf67-4548-81bb-3bfd210a0471', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d6400b4e3f4d98a7456330f6429bd5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cba96dbc-c401-4d81-b355-4680d6ad5e15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d06b6f4b-ccde-4903-a1fe-e6bac9f52057, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=ea9a363a-d800-41d1-b7a3-819f91395719) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:12:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:13.887 103703 INFO neutron.agent.ovn.metadata.agent [-] Port ea9a363a-d800-41d1-b7a3-819f91395719 in datapath 441198e3-04ff-48aa-b8a7-2339e4bb8085 unbound from our chassis
Oct 02 08:12:13 compute-0 nova_compute[192567]: 2025-10-02 08:12:13.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:13.889 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:12:13 compute-0 nova_compute[192567]: 2025-10-02 08:12:13.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:13.914 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[c4bdde4d-0f1f-4274-a134-78e9ac4f2f3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:13 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Oct 02 08:12:13 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 16.064s CPU time.
Oct 02 08:12:13 compute-0 systemd-machined[152597]: Machine qemu-2-instance-00000004 terminated.
Oct 02 08:12:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:13.962 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[7f245845-cbf9-4f96-b647-f96334509a7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:13.966 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[691f8700-7077-4c72-9d14-a43173dc7bf0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:14 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:14.012 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[608f1b74-6bfc-4570-9275-22bc71064f22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:14 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:14.043 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a24ce8d8-c6df-456a-bf6c-46226bc91661]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap441198e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:13:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 34, 'tx_packets': 17, 'rx_bytes': 1924, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 34, 'tx_packets': 17, 'rx_bytes': 1924, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348164, 'reachable_time': 44332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216402, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:14 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:14.072 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5f831a8a-5d34-4339-9694-e2b55bb69eff]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348180, 'tstamp': 348180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216412, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348183, 'tstamp': 348183}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216412, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:14 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:14.074 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441198e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:14 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:14.084 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap441198e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:14 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:14.084 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:12:14 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:14.085 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap441198e3-00, col_values=(('external_ids', {'iface-id': 'f4e4745f-6cb7-4dfe-930a-ab5c5f2db11b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:14 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:14.086 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.094 2 INFO nova.virt.libvirt.driver [-] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Instance destroyed successfully.
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.094 2 DEBUG nova.objects.instance [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'resources' on Instance uuid f13a8d11-bf67-4548-81bb-3bfd210a0471 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.116 2 DEBUG nova.virt.libvirt.vif [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:10:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-922374601',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-922374601',id=4,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:10:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-itjzem6y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',own
er_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:10:50Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=f13a8d11-bf67-4548-81bb-3bfd210a0471,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea9a363a-d800-41d1-b7a3-819f91395719", "address": "fa:16:3e:7c:27:70", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea9a363a-d8", "ovs_interfaceid": "ea9a363a-d800-41d1-b7a3-819f91395719", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.116 2 DEBUG nova.network.os_vif_util [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "ea9a363a-d800-41d1-b7a3-819f91395719", "address": "fa:16:3e:7c:27:70", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea9a363a-d8", "ovs_interfaceid": "ea9a363a-d800-41d1-b7a3-819f91395719", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.117 2 DEBUG nova.network.os_vif_util [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:27:70,bridge_name='br-int',has_traffic_filtering=True,id=ea9a363a-d800-41d1-b7a3-819f91395719,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea9a363a-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.118 2 DEBUG os_vif [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:27:70,bridge_name='br-int',has_traffic_filtering=True,id=ea9a363a-d800-41d1-b7a3-819f91395719,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea9a363a-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.120 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea9a363a-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.128 2 INFO os_vif [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:27:70,bridge_name='br-int',has_traffic_filtering=True,id=ea9a363a-d800-41d1-b7a3-819f91395719,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea9a363a-d8')
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.129 2 INFO nova.virt.libvirt.driver [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Deleting instance files /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471_del
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.130 2 INFO nova.virt.libvirt.driver [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Deletion of /var/lib/nova/instances/f13a8d11-bf67-4548-81bb-3bfd210a0471_del complete
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.209 2 INFO nova.compute.manager [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Took 0.42 seconds to destroy the instance on the hypervisor.
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.209 2 DEBUG oslo.service.loopingcall [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.210 2 DEBUG nova.compute.manager [-] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.210 2 DEBUG nova.network.neutron [-] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.322 2 DEBUG nova.compute.manager [req-b9a0a87f-32f2-4165-93ba-c4808acc1a53 req-3e34a283-e368-4cfa-9692-21201fd6a6ee 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received event network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.322 2 DEBUG oslo_concurrency.lockutils [req-b9a0a87f-32f2-4165-93ba-c4808acc1a53 req-3e34a283-e368-4cfa-9692-21201fd6a6ee 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.323 2 DEBUG oslo_concurrency.lockutils [req-b9a0a87f-32f2-4165-93ba-c4808acc1a53 req-3e34a283-e368-4cfa-9692-21201fd6a6ee 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.323 2 DEBUG oslo_concurrency.lockutils [req-b9a0a87f-32f2-4165-93ba-c4808acc1a53 req-3e34a283-e368-4cfa-9692-21201fd6a6ee 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b4d496c6-fc60-476d-84fa-b8183df48147-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.323 2 DEBUG nova.compute.manager [req-b9a0a87f-32f2-4165-93ba-c4808acc1a53 req-3e34a283-e368-4cfa-9692-21201fd6a6ee 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] No waiting events found dispatching network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.324 2 WARNING nova.compute.manager [req-b9a0a87f-32f2-4165-93ba-c4808acc1a53 req-3e34a283-e368-4cfa-9692-21201fd6a6ee 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received unexpected event network-vif-plugged-71aeead1-a439-4326-93bb-38c3281661f2 for instance with vm_state deleted and task_state None.
Oct 02 08:12:14 compute-0 nova_compute[192567]: 2025-10-02 08:12:14.390 2 DEBUG nova.compute.manager [req-18e56902-cbe6-403c-a73d-c5ca00a746f4 req-2c7f39f0-aa56-46f3-a3b8-8a2a7fda97aa 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Received event network-vif-deleted-71aeead1-a439-4326-93bb-38c3281661f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:14 compute-0 sshd-session[216238]: Failed password for root from 91.224.92.32 port 49122 ssh2
Oct 02 08:12:15 compute-0 nova_compute[192567]: 2025-10-02 08:12:15.148 2 DEBUG nova.network.neutron [-] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:12:15 compute-0 nova_compute[192567]: 2025-10-02 08:12:15.166 2 INFO nova.compute.manager [-] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Took 0.96 seconds to deallocate network for instance.
Oct 02 08:12:15 compute-0 nova_compute[192567]: 2025-10-02 08:12:15.226 2 DEBUG oslo_concurrency.lockutils [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:15 compute-0 nova_compute[192567]: 2025-10-02 08:12:15.226 2 DEBUG oslo_concurrency.lockutils [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:15 compute-0 nova_compute[192567]: 2025-10-02 08:12:15.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:15 compute-0 nova_compute[192567]: 2025-10-02 08:12:15.334 2 DEBUG nova.compute.provider_tree [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:12:15 compute-0 nova_compute[192567]: 2025-10-02 08:12:15.361 2 DEBUG nova.scheduler.client.report [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:12:15 compute-0 nova_compute[192567]: 2025-10-02 08:12:15.397 2 DEBUG oslo_concurrency.lockutils [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:15 compute-0 nova_compute[192567]: 2025-10-02 08:12:15.437 2 INFO nova.scheduler.client.report [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Deleted allocations for instance f13a8d11-bf67-4548-81bb-3bfd210a0471
Oct 02 08:12:15 compute-0 nova_compute[192567]: 2025-10-02 08:12:15.545 2 DEBUG oslo_concurrency.lockutils [None req-61f3ae3b-1065-481b-8c19-9cc8924454b0 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.304 2 DEBUG oslo_concurrency.lockutils [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "99ec0256-cf67-4122-81b7-d0767c5a1347" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.305 2 DEBUG oslo_concurrency.lockutils [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "99ec0256-cf67-4122-81b7-d0767c5a1347" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.305 2 DEBUG oslo_concurrency.lockutils [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "99ec0256-cf67-4122-81b7-d0767c5a1347-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.306 2 DEBUG oslo_concurrency.lockutils [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "99ec0256-cf67-4122-81b7-d0767c5a1347-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.306 2 DEBUG oslo_concurrency.lockutils [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "99ec0256-cf67-4122-81b7-d0767c5a1347-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.308 2 INFO nova.compute.manager [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Terminating instance
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.310 2 DEBUG nova.compute.manager [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:12:16 compute-0 kernel: tap3c8fd5e6-22 (unregistering): left promiscuous mode
Oct 02 08:12:16 compute-0 NetworkManager[51654]: <info>  [1759392736.3438] device (tap3c8fd5e6-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:12:16 compute-0 unix_chkpwd[216422]: password check failed for user (root)
Oct 02 08:12:16 compute-0 ovn_controller[94821]: 2025-10-02T08:12:16Z|00060|binding|INFO|Releasing lport 3c8fd5e6-225b-47a2-81df-d81b2d20fa8b from this chassis (sb_readonly=0)
Oct 02 08:12:16 compute-0 ovn_controller[94821]: 2025-10-02T08:12:16Z|00061|binding|INFO|Setting lport 3c8fd5e6-225b-47a2-81df-d81b2d20fa8b down in Southbound
Oct 02 08:12:16 compute-0 ovn_controller[94821]: 2025-10-02T08:12:16Z|00062|binding|INFO|Removing iface tap3c8fd5e6-22 ovn-installed in OVS
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.417 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:db:a8 10.100.0.10'], port_security=['fa:16:3e:85:db:a8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '99ec0256-cf67-4122-81b7-d0767c5a1347', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d6400b4e3f4d98a7456330f6429bd5', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'cba96dbc-c401-4d81-b355-4680d6ad5e15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d06b6f4b-ccde-4903-a1fe-e6bac9f52057, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=3c8fd5e6-225b-47a2-81df-d81b2d20fa8b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.421 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 3c8fd5e6-225b-47a2-81df-d81b2d20fa8b in datapath 441198e3-04ff-48aa-b8a7-2339e4bb8085 unbound from our chassis
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.423 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 441198e3-04ff-48aa-b8a7-2339e4bb8085
Oct 02 08:12:16 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Deactivated successfully.
Oct 02 08:12:16 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Consumed 2.821s CPU time.
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.451 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e99a1b25-8856-45e7-9569-629f35bfbde6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:16 compute-0 systemd-machined[152597]: Machine qemu-4-instance-00000003 terminated.
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.501 2 DEBUG nova.compute.manager [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Received event network-vif-unplugged-ea9a363a-d800-41d1-b7a3-819f91395719 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.501 2 DEBUG oslo_concurrency.lockutils [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.501 2 DEBUG oslo_concurrency.lockutils [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.501 2 DEBUG oslo_concurrency.lockutils [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.502 2 DEBUG nova.compute.manager [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] No waiting events found dispatching network-vif-unplugged-ea9a363a-d800-41d1-b7a3-819f91395719 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.502 2 WARNING nova.compute.manager [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Received unexpected event network-vif-unplugged-ea9a363a-d800-41d1-b7a3-819f91395719 for instance with vm_state deleted and task_state None.
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.502 2 DEBUG nova.compute.manager [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Received event network-vif-plugged-ea9a363a-d800-41d1-b7a3-819f91395719 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.502 2 DEBUG oslo_concurrency.lockutils [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.503 2 DEBUG oslo_concurrency.lockutils [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.503 2 DEBUG oslo_concurrency.lockutils [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f13a8d11-bf67-4548-81bb-3bfd210a0471-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.503 2 DEBUG nova.compute.manager [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] No waiting events found dispatching network-vif-plugged-ea9a363a-d800-41d1-b7a3-819f91395719 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.503 2 WARNING nova.compute.manager [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Received unexpected event network-vif-plugged-ea9a363a-d800-41d1-b7a3-819f91395719 for instance with vm_state deleted and task_state None.
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.504 2 DEBUG nova.compute.manager [req-d795135b-84fc-4f66-acf4-369cca3a2703 req-b02a9320-40b6-4410-9d28-19ad03fa9872 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Received event network-vif-deleted-ea9a363a-d800-41d1-b7a3-819f91395719 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.507 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0e6ae2-f0df-407b-b883-e6dd2fc8b298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.512 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[cce06d5a-3756-4537-a6cc-78e501397d27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:16 compute-0 podman[216424]: 2025-10-02 08:12:16.565312797 +0000 UTC m=+0.096780597 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.578 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[d81578d6-c4b1-46f9-a112-89cccf28783d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.604 2 INFO nova.virt.libvirt.driver [-] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Instance destroyed successfully.
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.607 2 DEBUG nova.objects.instance [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'resources' on Instance uuid 99ec0256-cf67-4122-81b7-d0767c5a1347 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.613 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe1833d-be56-4d3f-a7fa-6e579be9e939]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap441198e3-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9d:13:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 34, 'tx_packets': 19, 'rx_bytes': 1924, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 34, 'tx_packets': 19, 'rx_bytes': 1924, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348164, 'reachable_time': 44332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216465, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.625 2 DEBUG nova.virt.libvirt.vif [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:10:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-697284282',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-697284282',id=3,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:10:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-vaul1op1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:11:58Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=99ec0256-cf67-4122-81b7-d0767c5a1347,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "address": "fa:16:3e:85:db:a8", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c8fd5e6-22", "ovs_interfaceid": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.627 2 DEBUG nova.network.os_vif_util [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "address": "fa:16:3e:85:db:a8", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c8fd5e6-22", "ovs_interfaceid": "3c8fd5e6-225b-47a2-81df-d81b2d20fa8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.628 2 DEBUG nova.network.os_vif_util [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:85:db:a8,bridge_name='br-int',has_traffic_filtering=True,id=3c8fd5e6-225b-47a2-81df-d81b2d20fa8b,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c8fd5e6-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.629 2 DEBUG os_vif [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:db:a8,bridge_name='br-int',has_traffic_filtering=True,id=3c8fd5e6-225b-47a2-81df-d81b2d20fa8b,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c8fd5e6-22') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.633 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c8fd5e6-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.642 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[9074bfad-7108-4ddc-b0a4-15056a8b0ea5]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348180, 'tstamp': 348180}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216467, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap441198e3-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348183, 'tstamp': 348183}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216467, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.645 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441198e3-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.643 2 INFO os_vif [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:db:a8,bridge_name='br-int',has_traffic_filtering=True,id=3c8fd5e6-225b-47a2-81df-d81b2d20fa8b,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c8fd5e6-22')
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.644 2 INFO nova.virt.libvirt.driver [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Deleting instance files /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347_del
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.646 2 INFO nova.virt.libvirt.driver [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Deletion of /var/lib/nova/instances/99ec0256-cf67-4122-81b7-d0767c5a1347_del complete
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.649 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap441198e3-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.649 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.650 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap441198e3-00, col_values=(('external_ids', {'iface-id': 'f4e4745f-6cb7-4dfe-930a-ab5c5f2db11b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:16.651 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.707 2 INFO nova.compute.manager [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Took 0.40 seconds to destroy the instance on the hypervisor.
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.708 2 DEBUG oslo.service.loopingcall [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.709 2 DEBUG nova.compute.manager [-] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.709 2 DEBUG nova.network.neutron [-] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.887 2 DEBUG nova.compute.manager [req-7a13977d-5b2d-4c9c-b080-1ba757fa1908 req-ed0b3eca-2d75-48dc-9cc6-0c7738a57874 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Received event network-vif-unplugged-3c8fd5e6-225b-47a2-81df-d81b2d20fa8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.888 2 DEBUG oslo_concurrency.lockutils [req-7a13977d-5b2d-4c9c-b080-1ba757fa1908 req-ed0b3eca-2d75-48dc-9cc6-0c7738a57874 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "99ec0256-cf67-4122-81b7-d0767c5a1347-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.888 2 DEBUG oslo_concurrency.lockutils [req-7a13977d-5b2d-4c9c-b080-1ba757fa1908 req-ed0b3eca-2d75-48dc-9cc6-0c7738a57874 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "99ec0256-cf67-4122-81b7-d0767c5a1347-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.889 2 DEBUG oslo_concurrency.lockutils [req-7a13977d-5b2d-4c9c-b080-1ba757fa1908 req-ed0b3eca-2d75-48dc-9cc6-0c7738a57874 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "99ec0256-cf67-4122-81b7-d0767c5a1347-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.889 2 DEBUG nova.compute.manager [req-7a13977d-5b2d-4c9c-b080-1ba757fa1908 req-ed0b3eca-2d75-48dc-9cc6-0c7738a57874 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] No waiting events found dispatching network-vif-unplugged-3c8fd5e6-225b-47a2-81df-d81b2d20fa8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:12:16 compute-0 nova_compute[192567]: 2025-10-02 08:12:16.889 2 DEBUG nova.compute.manager [req-7a13977d-5b2d-4c9c-b080-1ba757fa1908 req-ed0b3eca-2d75-48dc-9cc6-0c7738a57874 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Received event network-vif-unplugged-3c8fd5e6-225b-47a2-81df-d81b2d20fa8b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:12:17 compute-0 nova_compute[192567]: 2025-10-02 08:12:17.272 2 DEBUG nova.network.neutron [-] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:12:17 compute-0 nova_compute[192567]: 2025-10-02 08:12:17.295 2 INFO nova.compute.manager [-] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Took 0.59 seconds to deallocate network for instance.
Oct 02 08:12:17 compute-0 nova_compute[192567]: 2025-10-02 08:12:17.332 2 DEBUG oslo_concurrency.lockutils [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:17 compute-0 nova_compute[192567]: 2025-10-02 08:12:17.333 2 DEBUG oslo_concurrency.lockutils [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:17 compute-0 nova_compute[192567]: 2025-10-02 08:12:17.339 2 DEBUG oslo_concurrency.lockutils [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:17 compute-0 nova_compute[192567]: 2025-10-02 08:12:17.489 2 INFO nova.scheduler.client.report [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Deleted allocations for instance 99ec0256-cf67-4122-81b7-d0767c5a1347
Oct 02 08:12:17 compute-0 nova_compute[192567]: 2025-10-02 08:12:17.551 2 DEBUG oslo_concurrency.lockutils [None req-a740aee8-dd71-4937-a231-c92d8b73ae25 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "99ec0256-cf67-4122-81b7-d0767c5a1347" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:18 compute-0 sshd-session[216238]: Failed password for root from 91.224.92.32 port 49122 ssh2
Oct 02 08:12:18 compute-0 sshd-session[216238]: Received disconnect from 91.224.92.32 port 49122:11:  [preauth]
Oct 02 08:12:18 compute-0 sshd-session[216238]: Disconnected from authenticating user root 91.224.92.32 port 49122 [preauth]
Oct 02 08:12:18 compute-0 sshd-session[216238]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 02 08:12:18 compute-0 nova_compute[192567]: 2025-10-02 08:12:18.623 2 DEBUG nova.compute.manager [req-ab025b8c-b320-4ade-b641-56ac62436350 req-52d08053-29e7-4679-8e35-98c947ec195c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Received event network-vif-deleted-3c8fd5e6-225b-47a2-81df-d81b2d20fa8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:18 compute-0 nova_compute[192567]: 2025-10-02 08:12:18.979 2 DEBUG nova.compute.manager [req-1ec61a48-b08f-4935-b131-f22d460b10f1 req-9ba85e89-7c89-4bfc-b3b1-61ee08ebb357 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Received event network-vif-plugged-3c8fd5e6-225b-47a2-81df-d81b2d20fa8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:18 compute-0 nova_compute[192567]: 2025-10-02 08:12:18.981 2 DEBUG oslo_concurrency.lockutils [req-1ec61a48-b08f-4935-b131-f22d460b10f1 req-9ba85e89-7c89-4bfc-b3b1-61ee08ebb357 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "99ec0256-cf67-4122-81b7-d0767c5a1347-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:18 compute-0 nova_compute[192567]: 2025-10-02 08:12:18.981 2 DEBUG oslo_concurrency.lockutils [req-1ec61a48-b08f-4935-b131-f22d460b10f1 req-9ba85e89-7c89-4bfc-b3b1-61ee08ebb357 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "99ec0256-cf67-4122-81b7-d0767c5a1347-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:18 compute-0 nova_compute[192567]: 2025-10-02 08:12:18.982 2 DEBUG oslo_concurrency.lockutils [req-1ec61a48-b08f-4935-b131-f22d460b10f1 req-9ba85e89-7c89-4bfc-b3b1-61ee08ebb357 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "99ec0256-cf67-4122-81b7-d0767c5a1347-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:18 compute-0 nova_compute[192567]: 2025-10-02 08:12:18.982 2 DEBUG nova.compute.manager [req-1ec61a48-b08f-4935-b131-f22d460b10f1 req-9ba85e89-7c89-4bfc-b3b1-61ee08ebb357 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] No waiting events found dispatching network-vif-plugged-3c8fd5e6-225b-47a2-81df-d81b2d20fa8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:12:18 compute-0 nova_compute[192567]: 2025-10-02 08:12:18.982 2 WARNING nova.compute.manager [req-1ec61a48-b08f-4935-b131-f22d460b10f1 req-9ba85e89-7c89-4bfc-b3b1-61ee08ebb357 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Received unexpected event network-vif-plugged-3c8fd5e6-225b-47a2-81df-d81b2d20fa8b for instance with vm_state deleted and task_state None.
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.281 2 DEBUG oslo_concurrency.lockutils [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "2e661e5f-2462-4ffd-99a7-afc83d45f425" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.281 2 DEBUG oslo_concurrency.lockutils [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.282 2 DEBUG oslo_concurrency.lockutils [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.282 2 DEBUG oslo_concurrency.lockutils [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.282 2 DEBUG oslo_concurrency.lockutils [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.284 2 INFO nova.compute.manager [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Terminating instance
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.285 2 DEBUG nova.compute.manager [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:12:19 compute-0 kernel: tap782354d7-24 (unregistering): left promiscuous mode
Oct 02 08:12:19 compute-0 NetworkManager[51654]: <info>  [1759392739.3112] device (tap782354d7-24): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:19 compute-0 ovn_controller[94821]: 2025-10-02T08:12:19Z|00063|binding|INFO|Releasing lport 782354d7-2469-4521-9850-4777d41a0047 from this chassis (sb_readonly=0)
Oct 02 08:12:19 compute-0 ovn_controller[94821]: 2025-10-02T08:12:19Z|00064|binding|INFO|Setting lport 782354d7-2469-4521-9850-4777d41a0047 down in Southbound
Oct 02 08:12:19 compute-0 ovn_controller[94821]: 2025-10-02T08:12:19Z|00065|binding|INFO|Removing iface tap782354d7-24 ovn-installed in OVS
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.331 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:f4:5e 10.100.0.8'], port_security=['fa:16:3e:6c:f4:5e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '2e661e5f-2462-4ffd-99a7-afc83d45f425', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a5d6400b4e3f4d98a7456330f6429bd5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cba96dbc-c401-4d81-b355-4680d6ad5e15', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d06b6f4b-ccde-4903-a1fe-e6bac9f52057, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=782354d7-2469-4521-9850-4777d41a0047) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.334 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 782354d7-2469-4521-9850-4777d41a0047 in datapath 441198e3-04ff-48aa-b8a7-2339e4bb8085 unbound from our chassis
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.337 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 441198e3-04ff-48aa-b8a7-2339e4bb8085, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.338 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[227cdec5-9046-4d4e-87ba-6cc84baf6145]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.339 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085 namespace which is not needed anymore
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:19 compute-0 unix_chkpwd[216482]: password check failed for user (root)
Oct 02 08:12:19 compute-0 sshd-session[216468]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 02 08:12:19 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Oct 02 08:12:19 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 20.513s CPU time.
Oct 02 08:12:19 compute-0 systemd-machined[152597]: Machine qemu-1-instance-00000002 terminated.
Oct 02 08:12:19 compute-0 neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085[215288]: [NOTICE]   (215292) : haproxy version is 2.8.14-c23fe91
Oct 02 08:12:19 compute-0 neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085[215288]: [NOTICE]   (215292) : path to executable is /usr/sbin/haproxy
Oct 02 08:12:19 compute-0 neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085[215288]: [WARNING]  (215292) : Exiting Master process...
Oct 02 08:12:19 compute-0 neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085[215288]: [ALERT]    (215292) : Current worker (215294) exited with code 143 (Terminated)
Oct 02 08:12:19 compute-0 neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085[215288]: [WARNING]  (215292) : All workers exited. Exiting... (0)
Oct 02 08:12:19 compute-0 systemd[1]: libpod-6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb.scope: Deactivated successfully.
Oct 02 08:12:19 compute-0 podman[216496]: 2025-10-02 08:12:19.573197332 +0000 UTC m=+0.075790058 container died 6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.579 2 INFO nova.virt.libvirt.driver [-] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Instance destroyed successfully.
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.580 2 DEBUG nova.objects.instance [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lazy-loading 'resources' on Instance uuid 2e661e5f-2462-4ffd-99a7-afc83d45f425 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.599 2 DEBUG nova.virt.libvirt.vif [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:09:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1330242329',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1330242329',id=2,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:09:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a5d6400b4e3f4d98a7456330f6429bd5',ramdisk_id='',reservation_id='r-ftj0v7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-547955480',owner_user_name='tempest-TestExecuteActionsViaActuator-547955480-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:09:51Z,user_data=None,user_id='4b5c71b386a34e829eef47bf613d813c',uuid=2e661e5f-2462-4ffd-99a7-afc83d45f425,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.600 2 DEBUG nova.network.os_vif_util [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converting VIF {"id": "782354d7-2469-4521-9850-4777d41a0047", "address": "fa:16:3e:6c:f4:5e", "network": {"id": "441198e3-04ff-48aa-b8a7-2339e4bb8085", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-653484411-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38d9b1ca205a4fa391e840136db0d930", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap782354d7-24", "ovs_interfaceid": "782354d7-2469-4521-9850-4777d41a0047", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.601 2 DEBUG nova.network.os_vif_util [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:f4:5e,bridge_name='br-int',has_traffic_filtering=True,id=782354d7-2469-4521-9850-4777d41a0047,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap782354d7-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.601 2 DEBUG os_vif [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:f4:5e,bridge_name='br-int',has_traffic_filtering=True,id=782354d7-2469-4521-9850-4777d41a0047,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap782354d7-24') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.605 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap782354d7-24, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.611 2 INFO os_vif [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:f4:5e,bridge_name='br-int',has_traffic_filtering=True,id=782354d7-2469-4521-9850-4777d41a0047,network=Network(441198e3-04ff-48aa-b8a7-2339e4bb8085),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap782354d7-24')
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.612 2 INFO nova.virt.libvirt.driver [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Deleting instance files /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425_del
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.614 2 INFO nova.virt.libvirt.driver [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Deletion of /var/lib/nova/instances/2e661e5f-2462-4ffd-99a7-afc83d45f425_del complete
Oct 02 08:12:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb-userdata-shm.mount: Deactivated successfully.
Oct 02 08:12:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-8d48ba9d0cf4319b7524e50130d2054a00142945398b8772533cd818497f0464-merged.mount: Deactivated successfully.
Oct 02 08:12:19 compute-0 podman[216496]: 2025-10-02 08:12:19.636986396 +0000 UTC m=+0.139579072 container cleanup 6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 02 08:12:19 compute-0 systemd[1]: libpod-conmon-6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb.scope: Deactivated successfully.
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.696 2 INFO nova.compute.manager [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Took 0.41 seconds to destroy the instance on the hypervisor.
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.697 2 DEBUG oslo.service.loopingcall [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.698 2 DEBUG nova.compute.manager [-] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.698 2 DEBUG nova.network.neutron [-] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:12:19 compute-0 podman[216537]: 2025-10-02 08:12:19.732566604 +0000 UTC m=+0.061009079 container remove 6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.739 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[fdbfc0e1-db2c-4c8a-aea9-aad4af128dc8]: (4, ('Thu Oct  2 08:12:19 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085 (6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb)\n6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb\nThu Oct  2 08:12:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085 (6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb)\n6816546c501eb60286f44f8f71bf334dfa8ef3a4e8feb59f5072a2bb309b0dfb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.741 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a994bd02-3051-40a4-807e-16ec295c18d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.742 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap441198e3-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:19 compute-0 kernel: tap441198e3-00: left promiscuous mode
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:19 compute-0 nova_compute[192567]: 2025-10-02 08:12:19.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.810 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8e6e59-75b8-4280-968f-94b195c896b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.837 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[731e495a-d774-47f1-af7d-e9deacccf076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.838 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0abd830f-a2e8-45a8-a9ca-74dfa894bdd5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.859 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f8e7ad7d-d2d4-4b39-b013-932b07be9e58]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348153, 'reachable_time': 26441, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216551, 'error': None, 'target': 'ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.874 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-441198e3-04ff-48aa-b8a7-2339e4bb8085 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:12:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d441198e3\x2d04ff\x2d48aa\x2db8a7\x2d2339e4bb8085.mount: Deactivated successfully.
Oct 02 08:12:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:19.875 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[90a54466-10dc-4869-8baa-94608965984f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.168 2 DEBUG nova.network.neutron [-] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.185 2 INFO nova.compute.manager [-] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Took 0.49 seconds to deallocate network for instance.
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.222 2 DEBUG oslo_concurrency.lockutils [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.223 2 DEBUG oslo_concurrency.lockutils [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.291 2 DEBUG nova.compute.provider_tree [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.304 2 DEBUG nova.scheduler.client.report [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.326 2 DEBUG oslo_concurrency.lockutils [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.361 2 INFO nova.scheduler.client.report [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Deleted allocations for instance 2e661e5f-2462-4ffd-99a7-afc83d45f425
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.438 2 DEBUG oslo_concurrency.lockutils [None req-ce15003a-1a64-42c9-a06b-3b0cc1b36d7e 4b5c71b386a34e829eef47bf613d813c a5d6400b4e3f4d98a7456330f6429bd5 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.650 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.650 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.651 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.651 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.732 2 DEBUG nova.compute.manager [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Received event network-vif-unplugged-782354d7-2469-4521-9850-4777d41a0047 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.732 2 DEBUG oslo_concurrency.lockutils [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.733 2 DEBUG oslo_concurrency.lockutils [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.733 2 DEBUG oslo_concurrency.lockutils [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.734 2 DEBUG nova.compute.manager [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] No waiting events found dispatching network-vif-unplugged-782354d7-2469-4521-9850-4777d41a0047 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.734 2 WARNING nova.compute.manager [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Received unexpected event network-vif-unplugged-782354d7-2469-4521-9850-4777d41a0047 for instance with vm_state deleted and task_state None.
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.735 2 DEBUG nova.compute.manager [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Received event network-vif-plugged-782354d7-2469-4521-9850-4777d41a0047 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.735 2 DEBUG oslo_concurrency.lockutils [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.735 2 DEBUG oslo_concurrency.lockutils [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.736 2 DEBUG oslo_concurrency.lockutils [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2e661e5f-2462-4ffd-99a7-afc83d45f425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.736 2 DEBUG nova.compute.manager [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] No waiting events found dispatching network-vif-plugged-782354d7-2469-4521-9850-4777d41a0047 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.737 2 WARNING nova.compute.manager [req-3d39e926-9da9-48d4-a857-6e0a5e6dd43d req-edca6f4d-8365-4035-9dec-ce288c3dff0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Received unexpected event network-vif-plugged-782354d7-2469-4521-9850-4777d41a0047 for instance with vm_state deleted and task_state None.
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.845 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.846 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=73.46946334838867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.847 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.847 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.888 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.888 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.909 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.928 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.948 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:12:20 compute-0 nova_compute[192567]: 2025-10-02 08:12:20.948 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:21 compute-0 nova_compute[192567]: 2025-10-02 08:12:21.091 2 DEBUG nova.compute.manager [req-36499152-07d3-43c9-a9da-7c60dfe25421 req-4ad0fe87-c83b-4d6c-aa49-57dc11cdb070 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Received event network-vif-deleted-782354d7-2469-4521-9850-4777d41a0047 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:12:22 compute-0 sshd-session[216468]: Failed password for root from 91.224.92.32 port 33326 ssh2
Oct 02 08:12:23 compute-0 unix_chkpwd[216554]: password check failed for user (root)
Oct 02 08:12:23 compute-0 nova_compute[192567]: 2025-10-02 08:12:23.944 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:12:23 compute-0 nova_compute[192567]: 2025-10-02 08:12:23.945 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:12:23 compute-0 nova_compute[192567]: 2025-10-02 08:12:23.945 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:12:24 compute-0 nova_compute[192567]: 2025-10-02 08:12:24.010 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:12:24 compute-0 nova_compute[192567]: 2025-10-02 08:12:24.012 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:12:24 compute-0 nova_compute[192567]: 2025-10-02 08:12:24.013 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:12:24 compute-0 nova_compute[192567]: 2025-10-02 08:12:24.488 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759392729.4858367, ed125c83-0f73-41e4-925c-db2354932843 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:12:24 compute-0 nova_compute[192567]: 2025-10-02 08:12:24.489 2 INFO nova.compute.manager [-] [instance: ed125c83-0f73-41e4-925c-db2354932843] VM Stopped (Lifecycle Event)
Oct 02 08:12:24 compute-0 nova_compute[192567]: 2025-10-02 08:12:24.512 2 DEBUG nova.compute.manager [None req-752daf46-1bd4-44b1-a393-57aa049fdf6c - - - - - -] [instance: ed125c83-0f73-41e4-925c-db2354932843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:12:24 compute-0 nova_compute[192567]: 2025-10-02 08:12:24.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:25 compute-0 nova_compute[192567]: 2025-10-02 08:12:25.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:25 compute-0 nova_compute[192567]: 2025-10-02 08:12:25.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:12:25 compute-0 sshd-session[216468]: Failed password for root from 91.224.92.32 port 33326 ssh2
Oct 02 08:12:26 compute-0 nova_compute[192567]: 2025-10-02 08:12:26.588 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759392731.5872686, b4d496c6-fc60-476d-84fa-b8183df48147 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:12:26 compute-0 nova_compute[192567]: 2025-10-02 08:12:26.588 2 INFO nova.compute.manager [-] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] VM Stopped (Lifecycle Event)
Oct 02 08:12:26 compute-0 nova_compute[192567]: 2025-10-02 08:12:26.617 2 DEBUG nova.compute.manager [None req-aaf6b128-e5b8-47e2-9405-2e58f42b8d37 - - - - - -] [instance: b4d496c6-fc60-476d-84fa-b8183df48147] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:12:27 compute-0 nova_compute[192567]: 2025-10-02 08:12:27.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:12:27 compute-0 nova_compute[192567]: 2025-10-02 08:12:27.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:12:27 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:27.733 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:12:27 compute-0 nova_compute[192567]: 2025-10-02 08:12:27.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:27 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:27.736 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:12:27 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:27.738 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:12:27 compute-0 unix_chkpwd[216556]: password check failed for user (root)
Oct 02 08:12:29 compute-0 nova_compute[192567]: 2025-10-02 08:12:29.092 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759392734.0909226, f13a8d11-bf67-4548-81bb-3bfd210a0471 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:12:29 compute-0 nova_compute[192567]: 2025-10-02 08:12:29.092 2 INFO nova.compute.manager [-] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] VM Stopped (Lifecycle Event)
Oct 02 08:12:29 compute-0 nova_compute[192567]: 2025-10-02 08:12:29.120 2 DEBUG nova.compute.manager [None req-edbb3403-cbe0-4f51-bf0d-bc62859fdf97 - - - - - -] [instance: f13a8d11-bf67-4548-81bb-3bfd210a0471] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:12:29 compute-0 sshd-session[216468]: Failed password for root from 91.224.92.32 port 33326 ssh2
Oct 02 08:12:29 compute-0 nova_compute[192567]: 2025-10-02 08:12:29.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:29 compute-0 podman[203011]: time="2025-10-02T08:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:12:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:12:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2988 "" "Go-http-client/1.1"
Oct 02 08:12:29 compute-0 sshd-session[216468]: Received disconnect from 91.224.92.32 port 33326:11:  [preauth]
Oct 02 08:12:29 compute-0 sshd-session[216468]: Disconnected from authenticating user root 91.224.92.32 port 33326 [preauth]
Oct 02 08:12:29 compute-0 sshd-session[216468]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 02 08:12:30 compute-0 nova_compute[192567]: 2025-10-02 08:12:30.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:30 compute-0 nova_compute[192567]: 2025-10-02 08:12:30.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:12:31 compute-0 openstack_network_exporter[205118]: ERROR   08:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:12:31 compute-0 openstack_network_exporter[205118]: ERROR   08:12:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:12:31 compute-0 openstack_network_exporter[205118]: ERROR   08:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:12:31 compute-0 openstack_network_exporter[205118]: ERROR   08:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:12:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:12:31 compute-0 openstack_network_exporter[205118]: ERROR   08:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:12:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:12:31 compute-0 nova_compute[192567]: 2025-10-02 08:12:31.601 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759392736.5999498, 99ec0256-cf67-4122-81b7-d0767c5a1347 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:12:31 compute-0 nova_compute[192567]: 2025-10-02 08:12:31.602 2 INFO nova.compute.manager [-] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] VM Stopped (Lifecycle Event)
Oct 02 08:12:31 compute-0 nova_compute[192567]: 2025-10-02 08:12:31.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:12:31 compute-0 nova_compute[192567]: 2025-10-02 08:12:31.677 2 DEBUG nova.compute.manager [None req-d096824a-326b-46db-8cae-02816fccc84c - - - - - -] [instance: 99ec0256-cf67-4122-81b7-d0767c5a1347] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:12:32 compute-0 podman[216557]: 2025-10-02 08:12:32.166744665 +0000 UTC m=+0.084102355 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, architecture=x86_64)
Oct 02 08:12:34 compute-0 nova_compute[192567]: 2025-10-02 08:12:34.572 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759392739.5708444, 2e661e5f-2462-4ffd-99a7-afc83d45f425 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:12:34 compute-0 nova_compute[192567]: 2025-10-02 08:12:34.573 2 INFO nova.compute.manager [-] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] VM Stopped (Lifecycle Event)
Oct 02 08:12:34 compute-0 nova_compute[192567]: 2025-10-02 08:12:34.606 2 DEBUG nova.compute.manager [None req-1a858a55-136a-4374-945a-03ce261c527d - - - - - -] [instance: 2e661e5f-2462-4ffd-99a7-afc83d45f425] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:12:34 compute-0 nova_compute[192567]: 2025-10-02 08:12:34.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:35 compute-0 nova_compute[192567]: 2025-10-02 08:12:35.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:39 compute-0 nova_compute[192567]: 2025-10-02 08:12:39.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:40 compute-0 nova_compute[192567]: 2025-10-02 08:12:40.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:42 compute-0 podman[216577]: 2025-10-02 08:12:42.192999954 +0000 UTC m=+0.100234514 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 02 08:12:42 compute-0 podman[216579]: 2025-10-02 08:12:42.19612807 +0000 UTC m=+0.094893567 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct 02 08:12:42 compute-0 podman[216578]: 2025-10-02 08:12:42.230567527 +0000 UTC m=+0.133079210 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:12:44 compute-0 podman[216642]: 2025-10-02 08:12:44.182597714 +0000 UTC m=+0.094155546 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 02 08:12:44 compute-0 nova_compute[192567]: 2025-10-02 08:12:44.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:45 compute-0 nova_compute[192567]: 2025-10-02 08:12:45.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:45.969 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:12:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:45.970 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:12:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:12:45.970 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:12:47 compute-0 podman[216662]: 2025-10-02 08:12:47.173629707 +0000 UTC m=+0.079245904 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:12:49 compute-0 nova_compute[192567]: 2025-10-02 08:12:49.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:50 compute-0 nova_compute[192567]: 2025-10-02 08:12:50.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:54 compute-0 nova_compute[192567]: 2025-10-02 08:12:54.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:55 compute-0 nova_compute[192567]: 2025-10-02 08:12:55.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:59 compute-0 nova_compute[192567]: 2025-10-02 08:12:59.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:12:59 compute-0 podman[203011]: time="2025-10-02T08:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:12:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:12:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2995 "" "Go-http-client/1.1"
Oct 02 08:13:00 compute-0 nova_compute[192567]: 2025-10-02 08:13:00.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:01 compute-0 openstack_network_exporter[205118]: ERROR   08:13:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:13:01 compute-0 openstack_network_exporter[205118]: ERROR   08:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:13:01 compute-0 openstack_network_exporter[205118]: ERROR   08:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:13:01 compute-0 openstack_network_exporter[205118]: ERROR   08:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:13:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:13:01 compute-0 openstack_network_exporter[205118]: ERROR   08:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:13:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:13:03 compute-0 podman[216687]: 2025-10-02 08:13:03.177031451 +0000 UTC m=+0.087062043 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 02 08:13:04 compute-0 nova_compute[192567]: 2025-10-02 08:13:04.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:05 compute-0 ovn_controller[94821]: 2025-10-02T08:13:05Z|00066|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 02 08:13:05 compute-0 nova_compute[192567]: 2025-10-02 08:13:05.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:09 compute-0 nova_compute[192567]: 2025-10-02 08:13:09.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:10 compute-0 nova_compute[192567]: 2025-10-02 08:13:10.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:13 compute-0 podman[216710]: 2025-10-02 08:13:13.22354099 +0000 UTC m=+0.088446766 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 08:13:13 compute-0 podman[216712]: 2025-10-02 08:13:13.232286189 +0000 UTC m=+0.087157277 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 02 08:13:13 compute-0 podman[216711]: 2025-10-02 08:13:13.268896162 +0000 UTC m=+0.122851422 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:13:14 compute-0 nova_compute[192567]: 2025-10-02 08:13:14.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:15 compute-0 podman[216772]: 2025-10-02 08:13:15.198783232 +0000 UTC m=+0.095757950 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:13:15 compute-0 nova_compute[192567]: 2025-10-02 08:13:15.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.139 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.139 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.154 2 DEBUG nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.270 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.271 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.283 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.283 2 INFO nova.compute.claims [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.485 2 DEBUG nova.compute.provider_tree [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.504 2 DEBUG nova.scheduler.client.report [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.535 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.536 2 DEBUG nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.600 2 DEBUG nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.601 2 DEBUG nova.network.neutron [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.634 2 INFO nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.654 2 DEBUG nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.764 2 DEBUG nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.766 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.767 2 INFO nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Creating image(s)
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.768 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Acquiring lock "/var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.769 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "/var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.771 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "/var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.801 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.894 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.896 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.896 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:16 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.913 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:16.999 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.001 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.043 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.044 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.045 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.142 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.144 2 DEBUG nova.virt.disk.api [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Checking if we can resize image /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.145 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.211 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.212 2 DEBUG nova.virt.disk.api [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Cannot resize image /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.213 2 DEBUG nova.objects.instance [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lazy-loading 'migration_context' on Instance uuid 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.235 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.236 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Ensure instance console log exists: /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.237 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.237 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:17 compute-0 nova_compute[192567]: 2025-10-02 08:13:17.237 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:18 compute-0 podman[216807]: 2025-10-02 08:13:18.15445075 +0000 UTC m=+0.069793603 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:13:19 compute-0 nova_compute[192567]: 2025-10-02 08:13:19.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:19 compute-0 nova_compute[192567]: 2025-10-02 08:13:19.871 2 DEBUG nova.network.neutron [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Successfully created port: b6727183-fb6a-4e44-ab6c-bd72ee94c08c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:13:20 compute-0 nova_compute[192567]: 2025-10-02 08:13:20.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:20 compute-0 nova_compute[192567]: 2025-10-02 08:13:20.591 2 DEBUG nova.network.neutron [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Successfully updated port: b6727183-fb6a-4e44-ab6c-bd72ee94c08c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:13:20 compute-0 nova_compute[192567]: 2025-10-02 08:13:20.616 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Acquiring lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:13:20 compute-0 nova_compute[192567]: 2025-10-02 08:13:20.617 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Acquired lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:13:20 compute-0 nova_compute[192567]: 2025-10-02 08:13:20.617 2 DEBUG nova.network.neutron [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:13:20 compute-0 nova_compute[192567]: 2025-10-02 08:13:20.714 2 DEBUG nova.compute.manager [req-9e2bc734-fff7-4b4d-8071-4b7734648496 req-d685c7f5-a0cb-4f9b-8acc-05a1c4094ef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-changed-b6727183-fb6a-4e44-ab6c-bd72ee94c08c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:13:20 compute-0 nova_compute[192567]: 2025-10-02 08:13:20.714 2 DEBUG nova.compute.manager [req-9e2bc734-fff7-4b4d-8071-4b7734648496 req-d685c7f5-a0cb-4f9b-8acc-05a1c4094ef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Refreshing instance network info cache due to event network-changed-b6727183-fb6a-4e44-ab6c-bd72ee94c08c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:13:20 compute-0 nova_compute[192567]: 2025-10-02 08:13:20.715 2 DEBUG oslo_concurrency.lockutils [req-9e2bc734-fff7-4b4d-8071-4b7734648496 req-d685c7f5-a0cb-4f9b-8acc-05a1c4094ef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:13:20 compute-0 nova_compute[192567]: 2025-10-02 08:13:20.786 2 DEBUG nova.network.neutron [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.656 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.657 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.657 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.657 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.868 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.870 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5877MB free_disk=73.46921157836914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.870 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.871 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.951 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.951 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:13:21 compute-0 nova_compute[192567]: 2025-10-02 08:13:21.952 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.015 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.030 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.060 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.060 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.936 2 DEBUG nova.network.neutron [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Updating instance_info_cache with network_info: [{"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.960 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Releasing lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.960 2 DEBUG nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Instance network_info: |[{"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.961 2 DEBUG oslo_concurrency.lockutils [req-9e2bc734-fff7-4b4d-8071-4b7734648496 req-d685c7f5-a0cb-4f9b-8acc-05a1c4094ef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.961 2 DEBUG nova.network.neutron [req-9e2bc734-fff7-4b4d-8071-4b7734648496 req-d685c7f5-a0cb-4f9b-8acc-05a1c4094ef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Refreshing network info cache for port b6727183-fb6a-4e44-ab6c-bd72ee94c08c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.966 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Start _get_guest_xml network_info=[{"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.975 2 WARNING nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.980 2 DEBUG nova.virt.libvirt.host [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.981 2 DEBUG nova.virt.libvirt.host [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.991 2 DEBUG nova.virt.libvirt.host [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.991 2 DEBUG nova.virt.libvirt.host [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.992 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.993 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.994 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.994 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.994 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.995 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.995 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.996 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.996 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.997 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:13:22 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.998 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:22.999 2 DEBUG nova.virt.hardware [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.006 2 DEBUG nova.virt.libvirt.vif [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:13:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-2086124668',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-2086124668',id=8,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8dd10f364145461c8590b5afcffde8b5',ramdisk_id='',reservation_id='r-8udnez2h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-631276167',owner_user_name='tempest-TestExecuteBasicStrategy-631276167-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:13:16Z,user_data=None,user_id='988f4dc5aba64890a5525fc4a2a95a85',uuid=396bfbd7-9258-4c84-9bdc-a0cb3fa92011,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.007 2 DEBUG nova.network.os_vif_util [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Converting VIF {"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.008 2 DEBUG nova.network.os_vif_util [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:c1:d8,bridge_name='br-int',has_traffic_filtering=True,id=b6727183-fb6a-4e44-ab6c-bd72ee94c08c,network=Network(356cd107-8de5-4a22-a07a-0e1a842e079e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6727183-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.010 2 DEBUG nova.objects.instance [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.027 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <uuid>396bfbd7-9258-4c84-9bdc-a0cb3fa92011</uuid>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <name>instance-00000008</name>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteBasicStrategy-server-2086124668</nova:name>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:13:22</nova:creationTime>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:13:23 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:13:23 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:13:23 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:13:23 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:13:23 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:13:23 compute-0 nova_compute[192567]:         <nova:user uuid="988f4dc5aba64890a5525fc4a2a95a85">tempest-TestExecuteBasicStrategy-631276167-project-admin</nova:user>
Oct 02 08:13:23 compute-0 nova_compute[192567]:         <nova:project uuid="8dd10f364145461c8590b5afcffde8b5">tempest-TestExecuteBasicStrategy-631276167</nova:project>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:13:23 compute-0 nova_compute[192567]:         <nova:port uuid="b6727183-fb6a-4e44-ab6c-bd72ee94c08c">
Oct 02 08:13:23 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <system>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <entry name="serial">396bfbd7-9258-4c84-9bdc-a0cb3fa92011</entry>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <entry name="uuid">396bfbd7-9258-4c84-9bdc-a0cb3fa92011</entry>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     </system>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <os>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   </os>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <features>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   </features>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk.config"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:88:c1:d8"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <target dev="tapb6727183-fb"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/console.log" append="off"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <video>
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     </video>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:13:23 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:13:23 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:13:23 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:13:23 compute-0 nova_compute[192567]: </domain>
Oct 02 08:13:23 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.029 2 DEBUG nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Preparing to wait for external event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.030 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.030 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.031 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.032 2 DEBUG nova.virt.libvirt.vif [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:13:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-2086124668',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-2086124668',id=8,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8dd10f364145461c8590b5afcffde8b5',ramdisk_id='',reservation_id='r-8udnez2h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-631276167',owner_user_name='tempest-TestExecuteBasicStrategy-631276167-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:13:16Z,user_data=None,user_id='988f4dc5aba64890a5525fc4a2a95a85',uuid=396bfbd7-9258-4c84-9bdc-a0cb3fa92011,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.033 2 DEBUG nova.network.os_vif_util [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Converting VIF {"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.034 2 DEBUG nova.network.os_vif_util [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:c1:d8,bridge_name='br-int',has_traffic_filtering=True,id=b6727183-fb6a-4e44-ab6c-bd72ee94c08c,network=Network(356cd107-8de5-4a22-a07a-0e1a842e079e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6727183-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.034 2 DEBUG os_vif [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:c1:d8,bridge_name='br-int',has_traffic_filtering=True,id=b6727183-fb6a-4e44-ab6c-bd72ee94c08c,network=Network(356cd107-8de5-4a22-a07a-0e1a842e079e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6727183-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.036 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.037 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.042 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb6727183-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.043 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb6727183-fb, col_values=(('external_ids', {'iface-id': 'b6727183-fb6a-4e44-ab6c-bd72ee94c08c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:c1:d8', 'vm-uuid': '396bfbd7-9258-4c84-9bdc-a0cb3fa92011'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:23 compute-0 NetworkManager[51654]: <info>  [1759392803.0986] manager: (tapb6727183-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.107 2 INFO os_vif [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:c1:d8,bridge_name='br-int',has_traffic_filtering=True,id=b6727183-fb6a-4e44-ab6c-bd72ee94c08c,network=Network(356cd107-8de5-4a22-a07a-0e1a842e079e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6727183-fb')
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.164 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.164 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.165 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] No VIF found with MAC fa:16:3e:88:c1:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.165 2 INFO nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Using config drive
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.919 2 INFO nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Creating config drive at /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk.config
Oct 02 08:13:23 compute-0 nova_compute[192567]: 2025-10-02 08:13:23.923 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr358zn7b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.053 2 DEBUG oslo_concurrency.processutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr358zn7b" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:13:24 compute-0 NetworkManager[51654]: <info>  [1759392804.1411] manager: (tapb6727183-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Oct 02 08:13:24 compute-0 kernel: tapb6727183-fb: entered promiscuous mode
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:24 compute-0 ovn_controller[94821]: 2025-10-02T08:13:24Z|00067|binding|INFO|Claiming lport b6727183-fb6a-4e44-ab6c-bd72ee94c08c for this chassis.
Oct 02 08:13:24 compute-0 ovn_controller[94821]: 2025-10-02T08:13:24Z|00068|binding|INFO|b6727183-fb6a-4e44-ab6c-bd72ee94c08c: Claiming fa:16:3e:88:c1:d8 10.100.0.10
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.216 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:c1:d8 10.100.0.10'], port_security=['fa:16:3e:88:c1:d8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '396bfbd7-9258-4c84-9bdc-a0cb3fa92011', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-356cd107-8de5-4a22-a07a-0e1a842e079e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8dd10f364145461c8590b5afcffde8b5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'eb5f05ba-7cf8-4faa-ab5c-ae003fe00087', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c4fa284-975c-4d95-a8b7-2726666e37c6, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=b6727183-fb6a-4e44-ab6c-bd72ee94c08c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.218 103703 INFO neutron.agent.ovn.metadata.agent [-] Port b6727183-fb6a-4e44-ab6c-bd72ee94c08c in datapath 356cd107-8de5-4a22-a07a-0e1a842e079e bound to our chassis
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.220 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 356cd107-8de5-4a22-a07a-0e1a842e079e
Oct 02 08:13:24 compute-0 systemd-udevd[216852]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:13:24 compute-0 systemd-machined[152597]: New machine qemu-6-instance-00000008.
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.244 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0d1ff027-cb2a-4016-8f74-5d12d14adee8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.245 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap356cd107-81 in ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.247 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap356cd107-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.247 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[df057475-2f49-434d-867d-fd597989df24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.248 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a54f3e-54fd-4d86-a194-8e86bfa8a825]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 NetworkManager[51654]: <info>  [1759392804.2555] device (tapb6727183-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:13:24 compute-0 NetworkManager[51654]: <info>  [1759392804.2571] device (tapb6727183-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.268 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[1c60079c-f039-45b3-b176-e3538cdd0c4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Oct 02 08:13:24 compute-0 ovn_controller[94821]: 2025-10-02T08:13:24Z|00069|binding|INFO|Setting lport b6727183-fb6a-4e44-ab6c-bd72ee94c08c ovn-installed in OVS
Oct 02 08:13:24 compute-0 ovn_controller[94821]: 2025-10-02T08:13:24Z|00070|binding|INFO|Setting lport b6727183-fb6a-4e44-ab6c-bd72ee94c08c up in Southbound
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.289 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[85a7b777-6106-4225-9e0a-07d5d6ebdc9a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.334 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[b1dd4238-e4db-4cae-adeb-cd3f7ff0a106]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 NetworkManager[51654]: <info>  [1759392804.3428] manager: (tap356cd107-80): new Veth device (/org/freedesktop/NetworkManager/Devices/36)
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.341 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4f46e00b-5da1-4b12-a3f5-05eb4f44669c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 systemd-udevd[216855]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.403 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[5fbcd068-5eb0-443c-8cc3-51ba4759e1aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.408 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf0a3e4-f1e7-4609-9a44-99a94e1e8a07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 NetworkManager[51654]: <info>  [1759392804.4386] device (tap356cd107-80): carrier: link connected
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.444 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[42ab623c-4fba-4ff5-906c-419931f2d788]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.469 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[ae2d4995-4532-450d-b076-356665cee7bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap356cd107-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:ff:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369202, 'reachable_time': 22362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216884, 'error': None, 'target': 'ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.493 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[715ddcb1-06b0-4b14-9c48-afdd8ccc1faa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:ffb0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369202, 'tstamp': 369202}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216885, 'error': None, 'target': 'ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.516 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[52cdbfc8-c462-4cd3-930a-12aeb06ea105]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap356cd107-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:ff:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369202, 'reachable_time': 22362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216886, 'error': None, 'target': 'ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.558 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[249cec6f-08d6-4d69-ac71-f7c30b1f409d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.646 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ee8f11-3dd5-48ed-b1b3-91714b941260]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.648 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap356cd107-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.648 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.649 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap356cd107-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:24 compute-0 NetworkManager[51654]: <info>  [1759392804.6524] manager: (tap356cd107-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Oct 02 08:13:24 compute-0 kernel: tap356cd107-80: entered promiscuous mode
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.658 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap356cd107-80, col_values=(('external_ids', {'iface-id': '365437f0-b97b-4940-9e84-2f717a3eea19'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:24 compute-0 ovn_controller[94821]: 2025-10-02T08:13:24Z|00071|binding|INFO|Releasing lport 365437f0-b97b-4940-9e84-2f717a3eea19 from this chassis (sb_readonly=0)
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.661 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/356cd107-8de5-4a22-a07a-0e1a842e079e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/356cd107-8de5-4a22-a07a-0e1a842e079e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.662 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[598973e3-2bcf-4467-b674-39ebbf89a2f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.663 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-356cd107-8de5-4a22-a07a-0e1a842e079e
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/356cd107-8de5-4a22-a07a-0e1a842e079e.pid.haproxy
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 356cd107-8de5-4a22-a07a-0e1a842e079e
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:13:24 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:24.664 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e', 'env', 'PROCESS_TAG=haproxy-356cd107-8de5-4a22-a07a-0e1a842e079e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/356cd107-8de5-4a22-a07a-0e1a842e079e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.856 2 DEBUG nova.compute.manager [req-a2b9cbcb-cb16-45e6-8564-5a60ee45da70 req-a1bfe000-c807-47dd-800f-2b78c86ca1ca 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.856 2 DEBUG oslo_concurrency.lockutils [req-a2b9cbcb-cb16-45e6-8564-5a60ee45da70 req-a1bfe000-c807-47dd-800f-2b78c86ca1ca 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.857 2 DEBUG oslo_concurrency.lockutils [req-a2b9cbcb-cb16-45e6-8564-5a60ee45da70 req-a1bfe000-c807-47dd-800f-2b78c86ca1ca 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.857 2 DEBUG oslo_concurrency.lockutils [req-a2b9cbcb-cb16-45e6-8564-5a60ee45da70 req-a1bfe000-c807-47dd-800f-2b78c86ca1ca 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:24 compute-0 nova_compute[192567]: 2025-10-02 08:13:24.858 2 DEBUG nova.compute.manager [req-a2b9cbcb-cb16-45e6-8564-5a60ee45da70 req-a1bfe000-c807-47dd-800f-2b78c86ca1ca 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Processing event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.055 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.059 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.060 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.060 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.077 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.078 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.078 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:13:25 compute-0 podman[216922]: 2025-10-02 08:13:25.111633579 +0000 UTC m=+0.082347899 container create 4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.145 2 DEBUG nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.147 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392805.1447399, 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.148 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] VM Started (Lifecycle Event)
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.152 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.156 2 INFO nova.virt.libvirt.driver [-] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Instance spawned successfully.
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.157 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.169 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:13:25 compute-0 podman[216922]: 2025-10-02 08:13:25.079973387 +0000 UTC m=+0.050687717 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:13:25 compute-0 systemd[1]: Started libpod-conmon-4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712.scope.
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.178 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.183 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.184 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.184 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.185 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.186 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.186 2 DEBUG nova.virt.libvirt.driver [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.194 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.195 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392805.145844, 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.195 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] VM Paused (Lifecycle Event)
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.217 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:13:25 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.222 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392805.1516483, 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.222 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] VM Resumed (Lifecycle Event)
Oct 02 08:13:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed25cb2ea1f9cfeb4bd494a882b88511f6da3d3e7359297b3e39d1ec2f957cd4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.241 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.245 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.250 2 INFO nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Took 8.49 seconds to spawn the instance on the hypervisor.
Oct 02 08:13:25 compute-0 podman[216922]: 2025-10-02 08:13:25.250961936 +0000 UTC m=+0.221676246 container init 4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.251 2 DEBUG nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:13:25 compute-0 podman[216922]: 2025-10-02 08:13:25.260671213 +0000 UTC m=+0.231385503 container start 4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.261 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.274 2 DEBUG nova.network.neutron [req-9e2bc734-fff7-4b4d-8071-4b7734648496 req-d685c7f5-a0cb-4f9b-8acc-05a1c4094ef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Updated VIF entry in instance network info cache for port b6727183-fb6a-4e44-ab6c-bd72ee94c08c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.275 2 DEBUG nova.network.neutron [req-9e2bc734-fff7-4b4d-8071-4b7734648496 req-d685c7f5-a0cb-4f9b-8acc-05a1c4094ef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Updating instance_info_cache with network_info: [{"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.298 2 DEBUG oslo_concurrency.lockutils [req-9e2bc734-fff7-4b4d-8071-4b7734648496 req-d685c7f5-a0cb-4f9b-8acc-05a1c4094ef3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:13:25 compute-0 neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e[216938]: [NOTICE]   (216942) : New worker (216944) forked
Oct 02 08:13:25 compute-0 neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e[216938]: [NOTICE]   (216942) : Loading success.
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.310 2 INFO nova.compute.manager [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Took 9.09 seconds to build instance.
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.328 2 DEBUG oslo_concurrency.lockutils [None req-f1f8d5af-f4a9-4ebe-a1d6-babdfd792151 988f4dc5aba64890a5525fc4a2a95a85 8dd10f364145461c8590b5afcffde8b5 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:13:25 compute-0 nova_compute[192567]: 2025-10-02 08:13:25.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:13:26 compute-0 nova_compute[192567]: 2025-10-02 08:13:26.968 2 DEBUG nova.compute.manager [req-34e384e8-11f8-4710-9c89-270324677276 req-dca217e0-1f31-4273-8251-a5633834438b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:13:26 compute-0 nova_compute[192567]: 2025-10-02 08:13:26.972 2 DEBUG oslo_concurrency.lockutils [req-34e384e8-11f8-4710-9c89-270324677276 req-dca217e0-1f31-4273-8251-a5633834438b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:26 compute-0 nova_compute[192567]: 2025-10-02 08:13:26.973 2 DEBUG oslo_concurrency.lockutils [req-34e384e8-11f8-4710-9c89-270324677276 req-dca217e0-1f31-4273-8251-a5633834438b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:26 compute-0 nova_compute[192567]: 2025-10-02 08:13:26.973 2 DEBUG oslo_concurrency.lockutils [req-34e384e8-11f8-4710-9c89-270324677276 req-dca217e0-1f31-4273-8251-a5633834438b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:26 compute-0 nova_compute[192567]: 2025-10-02 08:13:26.973 2 DEBUG nova.compute.manager [req-34e384e8-11f8-4710-9c89-270324677276 req-dca217e0-1f31-4273-8251-a5633834438b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] No waiting events found dispatching network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:13:26 compute-0 nova_compute[192567]: 2025-10-02 08:13:26.974 2 WARNING nova.compute.manager [req-34e384e8-11f8-4710-9c89-270324677276 req-dca217e0-1f31-4273-8251-a5633834438b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received unexpected event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c for instance with vm_state active and task_state None.
Oct 02 08:13:28 compute-0 nova_compute[192567]: 2025-10-02 08:13:28.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:29 compute-0 nova_compute[192567]: 2025-10-02 08:13:29.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:13:29 compute-0 nova_compute[192567]: 2025-10-02 08:13:29.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:13:29 compute-0 podman[203011]: time="2025-10-02T08:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:13:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:13:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3455 "" "Go-http-client/1.1"
Oct 02 08:13:30 compute-0 nova_compute[192567]: 2025-10-02 08:13:30.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:30 compute-0 nova_compute[192567]: 2025-10-02 08:13:30.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:13:31 compute-0 openstack_network_exporter[205118]: ERROR   08:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:13:31 compute-0 openstack_network_exporter[205118]: ERROR   08:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:13:31 compute-0 openstack_network_exporter[205118]: ERROR   08:13:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:13:31 compute-0 openstack_network_exporter[205118]: ERROR   08:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:13:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:13:31 compute-0 openstack_network_exporter[205118]: ERROR   08:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:13:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:13:31 compute-0 nova_compute[192567]: 2025-10-02 08:13:31.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:13:31 compute-0 nova_compute[192567]: 2025-10-02 08:13:31.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:13:33 compute-0 nova_compute[192567]: 2025-10-02 08:13:33.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:34 compute-0 podman[216953]: 2025-10-02 08:13:34.199268185 +0000 UTC m=+0.105355895 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:13:35 compute-0 nova_compute[192567]: 2025-10-02 08:13:35.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:36 compute-0 ovn_controller[94821]: 2025-10-02T08:13:36Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:88:c1:d8 10.100.0.10
Oct 02 08:13:36 compute-0 ovn_controller[94821]: 2025-10-02T08:13:36Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:88:c1:d8 10.100.0.10
Oct 02 08:13:38 compute-0 nova_compute[192567]: 2025-10-02 08:13:38.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:40 compute-0 nova_compute[192567]: 2025-10-02 08:13:40.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:43 compute-0 nova_compute[192567]: 2025-10-02 08:13:43.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:44 compute-0 podman[216994]: 2025-10-02 08:13:44.207428176 +0000 UTC m=+0.109340556 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 02 08:13:44 compute-0 podman[216992]: 2025-10-02 08:13:44.225866763 +0000 UTC m=+0.128845186 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 02 08:13:44 compute-0 podman[216993]: 2025-10-02 08:13:44.265553361 +0000 UTC m=+0.168612557 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251001)
Oct 02 08:13:45 compute-0 nova_compute[192567]: 2025-10-02 08:13:45.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:45.970 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:13:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:45.971 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:13:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:13:45.972 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:13:46 compute-0 podman[217060]: 2025-10-02 08:13:46.1787882 +0000 UTC m=+0.083578476 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:13:48 compute-0 nova_compute[192567]: 2025-10-02 08:13:48.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:49 compute-0 podman[217081]: 2025-10-02 08:13:49.195619965 +0000 UTC m=+0.098718751 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:13:50 compute-0 nova_compute[192567]: 2025-10-02 08:13:50.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:52 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 08:13:53 compute-0 nova_compute[192567]: 2025-10-02 08:13:53.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:54 compute-0 ovn_controller[94821]: 2025-10-02T08:13:54Z|00072|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Oct 02 08:13:55 compute-0 nova_compute[192567]: 2025-10-02 08:13:55.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:58 compute-0 nova_compute[192567]: 2025-10-02 08:13:58.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:13:59 compute-0 podman[203011]: time="2025-10-02T08:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:13:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:13:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3454 "" "Go-http-client/1.1"
Oct 02 08:14:00 compute-0 nova_compute[192567]: 2025-10-02 08:14:00.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:01 compute-0 openstack_network_exporter[205118]: ERROR   08:14:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:14:01 compute-0 openstack_network_exporter[205118]: ERROR   08:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:14:01 compute-0 openstack_network_exporter[205118]: ERROR   08:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:14:01 compute-0 openstack_network_exporter[205118]: ERROR   08:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:14:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:14:01 compute-0 openstack_network_exporter[205118]: ERROR   08:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:14:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:14:03 compute-0 nova_compute[192567]: 2025-10-02 08:14:03.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:05 compute-0 podman[217109]: 2025-10-02 08:14:05.198165941 +0000 UTC m=+0.100513367 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm)
Oct 02 08:14:05 compute-0 nova_compute[192567]: 2025-10-02 08:14:05.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:08 compute-0 nova_compute[192567]: 2025-10-02 08:14:08.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:10 compute-0 nova_compute[192567]: 2025-10-02 08:14:10.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:13 compute-0 nova_compute[192567]: 2025-10-02 08:14:13.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:15 compute-0 podman[217131]: 2025-10-02 08:14:15.172108833 +0000 UTC m=+0.086259479 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Oct 02 08:14:15 compute-0 podman[217133]: 2025-10-02 08:14:15.193902561 +0000 UTC m=+0.095246914 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd)
Oct 02 08:14:15 compute-0 podman[217132]: 2025-10-02 08:14:15.230219846 +0000 UTC m=+0.131026163 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:14:15 compute-0 nova_compute[192567]: 2025-10-02 08:14:15.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:17 compute-0 podman[217194]: 2025-10-02 08:14:17.186352422 +0000 UTC m=+0.092357335 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 02 08:14:18 compute-0 nova_compute[192567]: 2025-10-02 08:14:18.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:20 compute-0 podman[217214]: 2025-10-02 08:14:20.16218229 +0000 UTC m=+0.075500159 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:14:20 compute-0 nova_compute[192567]: 2025-10-02 08:14:20.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:23 compute-0 nova_compute[192567]: 2025-10-02 08:14:23.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:23 compute-0 nova_compute[192567]: 2025-10-02 08:14:23.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:14:23 compute-0 nova_compute[192567]: 2025-10-02 08:14:23.666 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:14:23 compute-0 nova_compute[192567]: 2025-10-02 08:14:23.667 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:14:23 compute-0 nova_compute[192567]: 2025-10-02 08:14:23.668 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:14:23 compute-0 nova_compute[192567]: 2025-10-02 08:14:23.669 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:14:23 compute-0 nova_compute[192567]: 2025-10-02 08:14:23.752 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:14:23 compute-0 nova_compute[192567]: 2025-10-02 08:14:23.839 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:14:23 compute-0 nova_compute[192567]: 2025-10-02 08:14:23.841 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:14:23 compute-0 nova_compute[192567]: 2025-10-02 08:14:23.932 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.207 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.209 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5709MB free_disk=73.44063186645508GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.209 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.209 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.291 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.291 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.291 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.341 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.359 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.384 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:14:24 compute-0 nova_compute[192567]: 2025-10-02 08:14:24.385 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:14:25 compute-0 nova_compute[192567]: 2025-10-02 08:14:25.380 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:14:25 compute-0 nova_compute[192567]: 2025-10-02 08:14:25.383 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:14:25 compute-0 nova_compute[192567]: 2025-10-02 08:14:25.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:14:25 compute-0 nova_compute[192567]: 2025-10-02 08:14:25.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:14:25 compute-0 nova_compute[192567]: 2025-10-02 08:14:25.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:14:25 compute-0 nova_compute[192567]: 2025-10-02 08:14:25.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:26 compute-0 nova_compute[192567]: 2025-10-02 08:14:26.773 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:14:26 compute-0 nova_compute[192567]: 2025-10-02 08:14:26.774 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:14:26 compute-0 nova_compute[192567]: 2025-10-02 08:14:26.775 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:14:26 compute-0 nova_compute[192567]: 2025-10-02 08:14:26.775 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:14:28 compute-0 nova_compute[192567]: 2025-10-02 08:14:28.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:29 compute-0 podman[203011]: time="2025-10-02T08:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:14:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:14:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3463 "" "Go-http-client/1.1"
Oct 02 08:14:29 compute-0 nova_compute[192567]: 2025-10-02 08:14:29.996 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Updating instance_info_cache with network_info: [{"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:14:30 compute-0 nova_compute[192567]: 2025-10-02 08:14:30.015 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:14:30 compute-0 nova_compute[192567]: 2025-10-02 08:14:30.016 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:14:30 compute-0 nova_compute[192567]: 2025-10-02 08:14:30.016 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:14:30 compute-0 nova_compute[192567]: 2025-10-02 08:14:30.017 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:14:30 compute-0 nova_compute[192567]: 2025-10-02 08:14:30.017 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:14:30 compute-0 nova_compute[192567]: 2025-10-02 08:14:30.017 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:14:30 compute-0 nova_compute[192567]: 2025-10-02 08:14:30.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:31 compute-0 openstack_network_exporter[205118]: ERROR   08:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:14:31 compute-0 openstack_network_exporter[205118]: ERROR   08:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:14:31 compute-0 openstack_network_exporter[205118]: ERROR   08:14:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:14:31 compute-0 openstack_network_exporter[205118]: ERROR   08:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:14:31 compute-0 openstack_network_exporter[205118]: ERROR   08:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:14:31 compute-0 nova_compute[192567]: 2025-10-02 08:14:31.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:14:33 compute-0 nova_compute[192567]: 2025-10-02 08:14:33.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:33 compute-0 nova_compute[192567]: 2025-10-02 08:14:33.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:14:35 compute-0 nova_compute[192567]: 2025-10-02 08:14:35.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:36 compute-0 podman[217245]: 2025-10-02 08:14:36.194266762 +0000 UTC m=+0.095290805 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Oct 02 08:14:38 compute-0 nova_compute[192567]: 2025-10-02 08:14:38.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:39 compute-0 nova_compute[192567]: 2025-10-02 08:14:39.959 2 DEBUG nova.compute.manager [None req-9441872f-43ba-4672-8dba-db2b0adbc999 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Oct 02 08:14:40 compute-0 nova_compute[192567]: 2025-10-02 08:14:40.039 2 DEBUG nova.compute.provider_tree [None req-9441872f-43ba-4672-8dba-db2b0adbc999 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 4 to 10 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:14:40 compute-0 nova_compute[192567]: 2025-10-02 08:14:40.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:43 compute-0 nova_compute[192567]: 2025-10-02 08:14:43.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:43 compute-0 nova_compute[192567]: 2025-10-02 08:14:43.731 2 DEBUG nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Check if temp file /var/lib/nova/instances/tmp_mb7c64k exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Oct 02 08:14:43 compute-0 nova_compute[192567]: 2025-10-02 08:14:43.732 2 DEBUG nova.compute.manager [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_mb7c64k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='396bfbd7-9258-4c84-9bdc-a0cb3fa92011',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Oct 02 08:14:45 compute-0 nova_compute[192567]: 2025-10-02 08:14:45.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:45.972 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:14:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:45.972 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:14:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:45.973 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:14:46 compute-0 nova_compute[192567]: 2025-10-02 08:14:46.022 2 DEBUG oslo_concurrency.processutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:14:46 compute-0 nova_compute[192567]: 2025-10-02 08:14:46.132 2 DEBUG oslo_concurrency.processutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk --force-share --output=json" returned: 0 in 0.109s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:14:46 compute-0 nova_compute[192567]: 2025-10-02 08:14:46.133 2 DEBUG oslo_concurrency.processutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:14:46 compute-0 podman[217269]: 2025-10-02 08:14:46.21336597 +0000 UTC m=+0.097687231 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Oct 02 08:14:46 compute-0 nova_compute[192567]: 2025-10-02 08:14:46.211 2 DEBUG oslo_concurrency.processutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:14:46 compute-0 podman[217267]: 2025-10-02 08:14:46.214062491 +0000 UTC m=+0.121490640 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:14:46 compute-0 podman[217268]: 2025-10-02 08:14:46.235425847 +0000 UTC m=+0.129522817 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Oct 02 08:14:48 compute-0 podman[217337]: 2025-10-02 08:14:48.179721309 +0000 UTC m=+0.086300030 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 02 08:14:48 compute-0 nova_compute[192567]: 2025-10-02 08:14:48.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:48 compute-0 sshd-session[217357]: Accepted publickey for nova from 192.168.122.101 port 42306 ssh2: ECDSA SHA256:nyj9easCU2+zJyxXdAvgdE/0ePVxCLkFf7X2/rv3WZg
Oct 02 08:14:48 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Oct 02 08:14:48 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct 02 08:14:48 compute-0 systemd-logind[827]: New session 35 of user nova.
Oct 02 08:14:48 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct 02 08:14:48 compute-0 systemd[1]: Starting User Manager for UID 42436...
Oct 02 08:14:48 compute-0 systemd[217361]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:14:49 compute-0 systemd[217361]: Queued start job for default target Main User Target.
Oct 02 08:14:49 compute-0 systemd[217361]: Created slice User Application Slice.
Oct 02 08:14:49 compute-0 systemd[217361]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 02 08:14:49 compute-0 systemd[217361]: Started Daily Cleanup of User's Temporary Directories.
Oct 02 08:14:49 compute-0 systemd[217361]: Reached target Paths.
Oct 02 08:14:49 compute-0 systemd[217361]: Reached target Timers.
Oct 02 08:14:49 compute-0 systemd[217361]: Starting D-Bus User Message Bus Socket...
Oct 02 08:14:49 compute-0 systemd[217361]: Starting Create User's Volatile Files and Directories...
Oct 02 08:14:49 compute-0 systemd[217361]: Listening on D-Bus User Message Bus Socket.
Oct 02 08:14:49 compute-0 systemd[217361]: Reached target Sockets.
Oct 02 08:14:49 compute-0 systemd[217361]: Finished Create User's Volatile Files and Directories.
Oct 02 08:14:49 compute-0 systemd[217361]: Reached target Basic System.
Oct 02 08:14:49 compute-0 systemd[217361]: Reached target Main User Target.
Oct 02 08:14:49 compute-0 systemd[217361]: Startup finished in 184ms.
Oct 02 08:14:49 compute-0 systemd[1]: Started User Manager for UID 42436.
Oct 02 08:14:49 compute-0 systemd[1]: Started Session 35 of User nova.
Oct 02 08:14:49 compute-0 sshd-session[217357]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:14:49 compute-0 sshd-session[217376]: Received disconnect from 192.168.122.101 port 42306:11: disconnected by user
Oct 02 08:14:49 compute-0 sshd-session[217376]: Disconnected from user nova 192.168.122.101 port 42306
Oct 02 08:14:49 compute-0 sshd-session[217357]: pam_unix(sshd:session): session closed for user nova
Oct 02 08:14:49 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Oct 02 08:14:49 compute-0 systemd-logind[827]: Session 35 logged out. Waiting for processes to exit.
Oct 02 08:14:49 compute-0 systemd-logind[827]: Removed session 35.
Oct 02 08:14:50 compute-0 nova_compute[192567]: 2025-10-02 08:14:50.458 2 DEBUG nova.compute.manager [req-370e0f63-ea21-40f2-b2e7-743ce35b4957 req-fd854578-0d33-44f2-ae5e-53d229bb7f9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-vif-unplugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:14:50 compute-0 nova_compute[192567]: 2025-10-02 08:14:50.460 2 DEBUG oslo_concurrency.lockutils [req-370e0f63-ea21-40f2-b2e7-743ce35b4957 req-fd854578-0d33-44f2-ae5e-53d229bb7f9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:14:50 compute-0 nova_compute[192567]: 2025-10-02 08:14:50.461 2 DEBUG oslo_concurrency.lockutils [req-370e0f63-ea21-40f2-b2e7-743ce35b4957 req-fd854578-0d33-44f2-ae5e-53d229bb7f9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:14:50 compute-0 nova_compute[192567]: 2025-10-02 08:14:50.461 2 DEBUG oslo_concurrency.lockutils [req-370e0f63-ea21-40f2-b2e7-743ce35b4957 req-fd854578-0d33-44f2-ae5e-53d229bb7f9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:14:50 compute-0 nova_compute[192567]: 2025-10-02 08:14:50.461 2 DEBUG nova.compute.manager [req-370e0f63-ea21-40f2-b2e7-743ce35b4957 req-fd854578-0d33-44f2-ae5e-53d229bb7f9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] No waiting events found dispatching network-vif-unplugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:14:50 compute-0 nova_compute[192567]: 2025-10-02 08:14:50.462 2 DEBUG nova.compute.manager [req-370e0f63-ea21-40f2-b2e7-743ce35b4957 req-fd854578-0d33-44f2-ae5e-53d229bb7f9b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-vif-unplugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:14:50 compute-0 nova_compute[192567]: 2025-10-02 08:14:50.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:50.466 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:14:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:50.468 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:14:50 compute-0 nova_compute[192567]: 2025-10-02 08:14:50.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:51 compute-0 podman[217378]: 2025-10-02 08:14:51.197477123 +0000 UTC m=+0.097195765 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.810 2 INFO nova.compute.manager [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Took 5.60 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.811 2 DEBUG nova.compute.manager [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.827 2 DEBUG nova.compute.manager [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_mb7c64k',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='396bfbd7-9258-4c84-9bdc-a0cb3fa92011',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(30a29276-62f8-446f-a0a4-6129584becae),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.852 2 DEBUG nova.objects.instance [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.854 2 DEBUG nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.856 2 DEBUG nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.856 2 DEBUG nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.871 2 DEBUG nova.virt.libvirt.vif [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:13:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-2086124668',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-2086124668',id=8,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:13:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8dd10f364145461c8590b5afcffde8b5',ramdisk_id='',reservation_id='r-8udnez2h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-631276167',owner_user_name='tempest-TestExecuteBasicStrategy-631276167-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:13:25Z,user_data=None,user_id='988f4dc5aba64890a5525fc4a2a95a85',uuid=396bfbd7-9258-4c84-9bdc-a0cb3fa92011,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.871 2 DEBUG nova.network.os_vif_util [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.872 2 DEBUG nova.network.os_vif_util [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:c1:d8,bridge_name='br-int',has_traffic_filtering=True,id=b6727183-fb6a-4e44-ab6c-bd72ee94c08c,network=Network(356cd107-8de5-4a22-a07a-0e1a842e079e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6727183-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.873 2 DEBUG nova.virt.libvirt.migration [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Updating guest XML with vif config: <interface type="ethernet">
Oct 02 08:14:51 compute-0 nova_compute[192567]:   <mac address="fa:16:3e:88:c1:d8"/>
Oct 02 08:14:51 compute-0 nova_compute[192567]:   <model type="virtio"/>
Oct 02 08:14:51 compute-0 nova_compute[192567]:   <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:14:51 compute-0 nova_compute[192567]:   <mtu size="1442"/>
Oct 02 08:14:51 compute-0 nova_compute[192567]:   <target dev="tapb6727183-fb"/>
Oct 02 08:14:51 compute-0 nova_compute[192567]: </interface>
Oct 02 08:14:51 compute-0 nova_compute[192567]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Oct 02 08:14:51 compute-0 nova_compute[192567]: 2025-10-02 08:14:51.874 2 DEBUG nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.360 2 DEBUG nova.virt.libvirt.migration [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.361 2 INFO nova.virt.libvirt.migration [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.427 2 INFO nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.592 2 DEBUG nova.compute.manager [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.593 2 DEBUG oslo_concurrency.lockutils [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.593 2 DEBUG oslo_concurrency.lockutils [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.593 2 DEBUG oslo_concurrency.lockutils [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.593 2 DEBUG nova.compute.manager [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] No waiting events found dispatching network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.593 2 WARNING nova.compute.manager [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received unexpected event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c for instance with vm_state active and task_state migrating.
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.594 2 DEBUG nova.compute.manager [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-changed-b6727183-fb6a-4e44-ab6c-bd72ee94c08c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.594 2 DEBUG nova.compute.manager [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Refreshing instance network info cache due to event network-changed-b6727183-fb6a-4e44-ab6c-bd72ee94c08c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.594 2 DEBUG oslo_concurrency.lockutils [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.594 2 DEBUG oslo_concurrency.lockutils [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.594 2 DEBUG nova.network.neutron [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Refreshing network info cache for port b6727183-fb6a-4e44-ab6c-bd72ee94c08c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.930 2 DEBUG nova.virt.libvirt.migration [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:14:52 compute-0 nova_compute[192567]: 2025-10-02 08:14:52.931 2 DEBUG nova.virt.libvirt.migration [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:14:53 compute-0 nova_compute[192567]: 2025-10-02 08:14:53.434 2 DEBUG nova.virt.libvirt.migration [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:14:53 compute-0 nova_compute[192567]: 2025-10-02 08:14:53.435 2 DEBUG nova.virt.libvirt.migration [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:14:53 compute-0 nova_compute[192567]: 2025-10-02 08:14:53.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:53 compute-0 nova_compute[192567]: 2025-10-02 08:14:53.868 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759392893.8676305, 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:14:53 compute-0 nova_compute[192567]: 2025-10-02 08:14:53.869 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] VM Paused (Lifecycle Event)
Oct 02 08:14:53 compute-0 nova_compute[192567]: 2025-10-02 08:14:53.902 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:14:53 compute-0 nova_compute[192567]: 2025-10-02 08:14:53.908 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:14:53 compute-0 nova_compute[192567]: 2025-10-02 08:14:53.939 2 DEBUG nova.virt.libvirt.migration [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:14:53 compute-0 nova_compute[192567]: 2025-10-02 08:14:53.940 2 DEBUG nova.virt.libvirt.migration [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:14:53 compute-0 nova_compute[192567]: 2025-10-02 08:14:53.941 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] During sync_power_state the instance has a pending task (migrating). Skip.
Oct 02 08:14:54 compute-0 kernel: tapb6727183-fb (unregistering): left promiscuous mode
Oct 02 08:14:54 compute-0 NetworkManager[51654]: <info>  [1759392894.0454] device (tapb6727183-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:14:54 compute-0 ovn_controller[94821]: 2025-10-02T08:14:54Z|00073|binding|INFO|Releasing lport b6727183-fb6a-4e44-ab6c-bd72ee94c08c from this chassis (sb_readonly=0)
Oct 02 08:14:54 compute-0 ovn_controller[94821]: 2025-10-02T08:14:54Z|00074|binding|INFO|Setting lport b6727183-fb6a-4e44-ab6c-bd72ee94c08c down in Southbound
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:54 compute-0 ovn_controller[94821]: 2025-10-02T08:14:54Z|00075|binding|INFO|Removing iface tapb6727183-fb ovn-installed in OVS
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.067 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:c1:d8 10.100.0.10'], port_security=['fa:16:3e:88:c1:d8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '61f597a0-da80-455c-aab0-956a1e15f143'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '396bfbd7-9258-4c84-9bdc-a0cb3fa92011', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-356cd107-8de5-4a22-a07a-0e1a842e079e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8dd10f364145461c8590b5afcffde8b5', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'eb5f05ba-7cf8-4faa-ab5c-ae003fe00087', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c4fa284-975c-4d95-a8b7-2726666e37c6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=b6727183-fb6a-4e44-ab6c-bd72ee94c08c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.068 103703 INFO neutron.agent.ovn.metadata.agent [-] Port b6727183-fb6a-4e44-ab6c-bd72ee94c08c in datapath 356cd107-8de5-4a22-a07a-0e1a842e079e unbound from our chassis
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.069 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 356cd107-8de5-4a22-a07a-0e1a842e079e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.071 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[2d512864-2015-4e4f-aadf-b43b13a12b29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.072 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e namespace which is not needed anymore
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:54 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Oct 02 08:14:54 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 16.148s CPU time.
Oct 02 08:14:54 compute-0 systemd-machined[152597]: Machine qemu-6-instance-00000008 terminated.
Oct 02 08:14:54 compute-0 neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e[216938]: [NOTICE]   (216942) : haproxy version is 2.8.14-c23fe91
Oct 02 08:14:54 compute-0 neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e[216938]: [NOTICE]   (216942) : path to executable is /usr/sbin/haproxy
Oct 02 08:14:54 compute-0 neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e[216938]: [WARNING]  (216942) : Exiting Master process...
Oct 02 08:14:54 compute-0 neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e[216938]: [WARNING]  (216942) : Exiting Master process...
Oct 02 08:14:54 compute-0 neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e[216938]: [ALERT]    (216942) : Current worker (216944) exited with code 143 (Terminated)
Oct 02 08:14:54 compute-0 neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e[216938]: [WARNING]  (216942) : All workers exited. Exiting... (0)
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:54 compute-0 systemd[1]: libpod-4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712.scope: Deactivated successfully.
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:54 compute-0 podman[217447]: 2025-10-02 08:14:54.261436055 +0000 UTC m=+0.055620129 container died 4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 02 08:14:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712-userdata-shm.mount: Deactivated successfully.
Oct 02 08:14:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-ed25cb2ea1f9cfeb4bd494a882b88511f6da3d3e7359297b3e39d1ec2f957cd4-merged.mount: Deactivated successfully.
Oct 02 08:14:54 compute-0 podman[217447]: 2025-10-02 08:14:54.304570869 +0000 UTC m=+0.098754943 container cleanup 4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.324 2 DEBUG nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.325 2 DEBUG nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.325 2 DEBUG nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Oct 02 08:14:54 compute-0 systemd[1]: libpod-conmon-4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712.scope: Deactivated successfully.
Oct 02 08:14:54 compute-0 podman[217490]: 2025-10-02 08:14:54.385478732 +0000 UTC m=+0.049967595 container remove 4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.393 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[04f084ce-f888-4968-b797-1cdf0608c2e1]: (4, ('Thu Oct  2 08:14:54 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e (4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712)\n4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712\nThu Oct  2 08:14:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e (4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712)\n4b93bed33d188577a3615f0405a417772d8172fbe88ded3374815ec191550712\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.395 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[42a7c3ef-205e-4181-94c2-7743a2194e42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.397 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap356cd107-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:54 compute-0 kernel: tap356cd107-80: left promiscuous mode
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.468 2 DEBUG nova.virt.libvirt.guest [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '396bfbd7-9258-4c84-9bdc-a0cb3fa92011' (instance-00000008) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.470 2 INFO nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Migration operation has completed
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.470 2 INFO nova.compute.manager [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] _post_live_migration() is started..
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.469 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.471 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[dc6b2e2e-607c-4379-8746-589a62d88a97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.509 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf80e9f-48a6-4329-bfe7-d537648b3103]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.511 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8af5ff57-cc92-4b7b-bdce-dc98ca841b70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.534 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[12c3ddd5-17cf-4e8c-8427-5e71f236cb54]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369191, 'reachable_time': 31932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217509, 'error': None, 'target': 'ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.537 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-356cd107-8de5-4a22-a07a-0e1a842e079e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:14:54 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:14:54.537 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[239d6fb2-0476-47b7-872e-e622639253be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:14:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d356cd107\x2d8de5\x2d4a22\x2da07a\x2d0e1a842e079e.mount: Deactivated successfully.
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.998 2 DEBUG nova.compute.manager [req-29780b2b-2a3a-43b7-a84e-cc9139e41fa0 req-8aa21b01-c78c-4e05-bc40-04ba55e235c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-vif-unplugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.998 2 DEBUG oslo_concurrency.lockutils [req-29780b2b-2a3a-43b7-a84e-cc9139e41fa0 req-8aa21b01-c78c-4e05-bc40-04ba55e235c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:14:54 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.999 2 DEBUG oslo_concurrency.lockutils [req-29780b2b-2a3a-43b7-a84e-cc9139e41fa0 req-8aa21b01-c78c-4e05-bc40-04ba55e235c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:14:55 compute-0 nova_compute[192567]: 2025-10-02 08:14:54.999 2 DEBUG oslo_concurrency.lockutils [req-29780b2b-2a3a-43b7-a84e-cc9139e41fa0 req-8aa21b01-c78c-4e05-bc40-04ba55e235c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:14:55 compute-0 nova_compute[192567]: 2025-10-02 08:14:55.000 2 DEBUG nova.compute.manager [req-29780b2b-2a3a-43b7-a84e-cc9139e41fa0 req-8aa21b01-c78c-4e05-bc40-04ba55e235c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] No waiting events found dispatching network-vif-unplugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:14:55 compute-0 nova_compute[192567]: 2025-10-02 08:14:55.000 2 DEBUG nova.compute.manager [req-29780b2b-2a3a-43b7-a84e-cc9139e41fa0 req-8aa21b01-c78c-4e05-bc40-04ba55e235c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-vif-unplugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:14:55 compute-0 nova_compute[192567]: 2025-10-02 08:14:55.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.238 2 DEBUG nova.network.neutron [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Activated binding for port b6727183-fb6a-4e44-ab6c-bd72ee94c08c and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.239 2 DEBUG nova.compute.manager [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.240 2 DEBUG nova.virt.libvirt.vif [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:13:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-2086124668',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-2086124668',id=8,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:13:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8dd10f364145461c8590b5afcffde8b5',ramdisk_id='',reservation_id='r-8udnez2h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-631276167',owner_user_name='tempest-TestExecuteBasicStrategy-631276167-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:14:41Z,user_data=None,user_id='988f4dc5aba64890a5525fc4a2a95a85',uuid=396bfbd7-9258-4c84-9bdc-a0cb3fa92011,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.240 2 DEBUG nova.network.os_vif_util [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.240 2 DEBUG nova.network.os_vif_util [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:c1:d8,bridge_name='br-int',has_traffic_filtering=True,id=b6727183-fb6a-4e44-ab6c-bd72ee94c08c,network=Network(356cd107-8de5-4a22-a07a-0e1a842e079e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6727183-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.241 2 DEBUG os_vif [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:c1:d8,bridge_name='br-int',has_traffic_filtering=True,id=b6727183-fb6a-4e44-ab6c-bd72ee94c08c,network=Network(356cd107-8de5-4a22-a07a-0e1a842e079e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6727183-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.243 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb6727183-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.249 2 INFO os_vif [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:c1:d8,bridge_name='br-int',has_traffic_filtering=True,id=b6727183-fb6a-4e44-ab6c-bd72ee94c08c,network=Network(356cd107-8de5-4a22-a07a-0e1a842e079e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6727183-fb')
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.249 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.249 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.249 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.250 2 DEBUG nova.compute.manager [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.250 2 INFO nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Deleting instance files /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011_del
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.251 2 INFO nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Deletion of /var/lib/nova/instances/396bfbd7-9258-4c84-9bdc-a0cb3fa92011_del complete
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.287 2 DEBUG nova.network.neutron [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Updated VIF entry in instance network info cache for port b6727183-fb6a-4e44-ab6c-bd72ee94c08c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.287 2 DEBUG nova.network.neutron [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Updating instance_info_cache with network_info: [{"id": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "address": "fa:16:3e:88:c1:d8", "network": {"id": "356cd107-8de5-4a22-a07a-0e1a842e079e", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-523811180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8b4fd811214f48488a511e0a8a1fed62", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6727183-fb", "ovs_interfaceid": "b6727183-fb6a-4e44-ab6c-bd72ee94c08c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:14:56 compute-0 nova_compute[192567]: 2025-10-02 08:14:56.324 2 DEBUG oslo_concurrency.lockutils [req-cdbc53ec-63df-4fd6-9d74-8ebf3e421ca1 req-f4aab882-26bc-429b-8eab-ec739ae35fc3 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-396bfbd7-9258-4c84-9bdc-a0cb3fa92011" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.106 2 DEBUG nova.compute.manager [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.106 2 DEBUG oslo_concurrency.lockutils [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.107 2 DEBUG oslo_concurrency.lockutils [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.107 2 DEBUG oslo_concurrency.lockutils [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.108 2 DEBUG nova.compute.manager [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] No waiting events found dispatching network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.108 2 WARNING nova.compute.manager [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received unexpected event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c for instance with vm_state active and task_state migrating.
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.108 2 DEBUG nova.compute.manager [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.109 2 DEBUG oslo_concurrency.lockutils [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.109 2 DEBUG oslo_concurrency.lockutils [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.110 2 DEBUG oslo_concurrency.lockutils [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.110 2 DEBUG nova.compute.manager [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] No waiting events found dispatching network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.110 2 WARNING nova.compute.manager [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received unexpected event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c for instance with vm_state active and task_state migrating.
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.111 2 DEBUG nova.compute.manager [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.111 2 DEBUG oslo_concurrency.lockutils [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.112 2 DEBUG oslo_concurrency.lockutils [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.112 2 DEBUG oslo_concurrency.lockutils [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.112 2 DEBUG nova.compute.manager [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] No waiting events found dispatching network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:14:57 compute-0 nova_compute[192567]: 2025-10-02 08:14:57.113 2 WARNING nova.compute.manager [req-fd2e4202-b210-463b-b15b-33e261bad89b req-5859f5b2-cf30-436a-9143-dc47086a1143 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Received unexpected event network-vif-plugged-b6727183-fb6a-4e44-ab6c-bd72ee94c08c for instance with vm_state active and task_state migrating.
Oct 02 08:14:59 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Oct 02 08:14:59 compute-0 systemd[217361]: Activating special unit Exit the Session...
Oct 02 08:14:59 compute-0 systemd[217361]: Stopped target Main User Target.
Oct 02 08:14:59 compute-0 systemd[217361]: Stopped target Basic System.
Oct 02 08:14:59 compute-0 systemd[217361]: Stopped target Paths.
Oct 02 08:14:59 compute-0 systemd[217361]: Stopped target Sockets.
Oct 02 08:14:59 compute-0 systemd[217361]: Stopped target Timers.
Oct 02 08:14:59 compute-0 systemd[217361]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 02 08:14:59 compute-0 systemd[217361]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 02 08:14:59 compute-0 systemd[217361]: Closed D-Bus User Message Bus Socket.
Oct 02 08:14:59 compute-0 systemd[217361]: Stopped Create User's Volatile Files and Directories.
Oct 02 08:14:59 compute-0 systemd[217361]: Removed slice User Application Slice.
Oct 02 08:14:59 compute-0 systemd[217361]: Reached target Shutdown.
Oct 02 08:14:59 compute-0 systemd[217361]: Finished Exit the Session.
Oct 02 08:14:59 compute-0 systemd[217361]: Reached target Exit the Session.
Oct 02 08:14:59 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Oct 02 08:14:59 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Oct 02 08:14:59 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct 02 08:14:59 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct 02 08:14:59 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct 02 08:14:59 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct 02 08:14:59 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Oct 02 08:14:59 compute-0 podman[203011]: time="2025-10-02T08:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:14:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:14:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2996 "" "Go-http-client/1.1"
Oct 02 08:15:00 compute-0 nova_compute[192567]: 2025-10-02 08:15:00.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:01 compute-0 nova_compute[192567]: 2025-10-02 08:15:01.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:01 compute-0 openstack_network_exporter[205118]: ERROR   08:15:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:15:01 compute-0 openstack_network_exporter[205118]: ERROR   08:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:15:01 compute-0 openstack_network_exporter[205118]: ERROR   08:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:15:01 compute-0 openstack_network_exporter[205118]: ERROR   08:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:15:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:15:01 compute-0 openstack_network_exporter[205118]: ERROR   08:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:15:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:15:01 compute-0 nova_compute[192567]: 2025-10-02 08:15:01.974 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:15:01 compute-0 nova_compute[192567]: 2025-10-02 08:15:01.975 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:15:01 compute-0 nova_compute[192567]: 2025-10-02 08:15:01.975 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "396bfbd7-9258-4c84-9bdc-a0cb3fa92011-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:15:01 compute-0 nova_compute[192567]: 2025-10-02 08:15:01.998 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:15:01 compute-0 nova_compute[192567]: 2025-10-02 08:15:01.998 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:15:01 compute-0 nova_compute[192567]: 2025-10-02 08:15:01.999 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:01.999 2 DEBUG nova.compute.resource_tracker [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.264 2 WARNING nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.265 2 DEBUG nova.compute.resource_tracker [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5872MB free_disk=73.46548461914062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.266 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.266 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.321 2 DEBUG nova.compute.resource_tracker [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Migration for instance 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.350 2 DEBUG nova.compute.resource_tracker [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.386 2 DEBUG nova.compute.resource_tracker [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Migration 30a29276-62f8-446f-a0a4-6129584becae is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.386 2 DEBUG nova.compute.resource_tracker [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.387 2 DEBUG nova.compute.resource_tracker [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.440 2 DEBUG nova.compute.provider_tree [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.456 2 DEBUG nova.scheduler.client.report [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.479 2 DEBUG nova.compute.resource_tracker [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.480 2 DEBUG oslo_concurrency.lockutils [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.487 2 INFO nova.compute.manager [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.570 2 INFO nova.scheduler.client.report [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Deleted allocation for migration 30a29276-62f8-446f-a0a4-6129584becae
Oct 02 08:15:02 compute-0 nova_compute[192567]: 2025-10-02 08:15:02.571 2 DEBUG nova.virt.libvirt.driver [None req-f5cf91b4-f562-481e-ae43-2829afc142dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Oct 02 08:15:05 compute-0 nova_compute[192567]: 2025-10-02 08:15:05.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:06 compute-0 nova_compute[192567]: 2025-10-02 08:15:06.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:07 compute-0 podman[217514]: 2025-10-02 08:15:07.186109571 +0000 UTC m=+0.086261296 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 02 08:15:09 compute-0 nova_compute[192567]: 2025-10-02 08:15:09.322 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759392894.320203, 396bfbd7-9258-4c84-9bdc-a0cb3fa92011 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:15:09 compute-0 nova_compute[192567]: 2025-10-02 08:15:09.323 2 INFO nova.compute.manager [-] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] VM Stopped (Lifecycle Event)
Oct 02 08:15:09 compute-0 nova_compute[192567]: 2025-10-02 08:15:09.353 2 DEBUG nova.compute.manager [None req-0df6e5ed-094d-420e-97e3-c0e30eb3a5ee - - - - - -] [instance: 396bfbd7-9258-4c84-9bdc-a0cb3fa92011] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:15:10 compute-0 nova_compute[192567]: 2025-10-02 08:15:10.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:11 compute-0 nova_compute[192567]: 2025-10-02 08:15:11.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:12 compute-0 nova_compute[192567]: 2025-10-02 08:15:12.716 2 DEBUG nova.compute.manager [None req-6e4ea787-29dc-4231-ade8-1fd2ee23fd84 06fd0ba32e344f06ac22f27398df6fab a46cbd7217a541c58391886cae342f44 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Oct 02 08:15:12 compute-0 nova_compute[192567]: 2025-10-02 08:15:12.811 2 DEBUG nova.compute.provider_tree [None req-6e4ea787-29dc-4231-ade8-1fd2ee23fd84 06fd0ba32e344f06ac22f27398df6fab a46cbd7217a541c58391886cae342f44 - - default default] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 10 to 13 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:15:15 compute-0 nova_compute[192567]: 2025-10-02 08:15:15.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:16 compute-0 nova_compute[192567]: 2025-10-02 08:15:16.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:17 compute-0 podman[217537]: 2025-10-02 08:15:17.197212818 +0000 UTC m=+0.093005564 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 02 08:15:17 compute-0 podman[217539]: 2025-10-02 08:15:17.229728258 +0000 UTC m=+0.114513243 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Oct 02 08:15:17 compute-0 podman[217538]: 2025-10-02 08:15:17.299306335 +0000 UTC m=+0.188290760 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:15:19 compute-0 podman[217599]: 2025-10-02 08:15:19.183110516 +0000 UTC m=+0.084046857 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 02 08:15:20 compute-0 nova_compute[192567]: 2025-10-02 08:15:20.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:20 compute-0 nova_compute[192567]: 2025-10-02 08:15:20.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:21 compute-0 nova_compute[192567]: 2025-10-02 08:15:21.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:22 compute-0 podman[217619]: 2025-10-02 08:15:22.151830882 +0000 UTC m=+0.062080506 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.661 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.661 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.661 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.662 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.903 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.905 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5887MB free_disk=73.46551513671875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.905 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.906 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.998 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:15:23 compute-0 nova_compute[192567]: 2025-10-02 08:15:23.999 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:15:24 compute-0 nova_compute[192567]: 2025-10-02 08:15:24.027 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:15:24 compute-0 nova_compute[192567]: 2025-10-02 08:15:24.052 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:15:24 compute-0 nova_compute[192567]: 2025-10-02 08:15:24.053 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:15:24 compute-0 nova_compute[192567]: 2025-10-02 08:15:24.069 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:15:24 compute-0 nova_compute[192567]: 2025-10-02 08:15:24.106 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:15:24 compute-0 nova_compute[192567]: 2025-10-02 08:15:24.129 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:15:24 compute-0 nova_compute[192567]: 2025-10-02 08:15:24.144 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:15:24 compute-0 nova_compute[192567]: 2025-10-02 08:15:24.146 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:15:24 compute-0 nova_compute[192567]: 2025-10-02 08:15:24.147 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:15:25 compute-0 nova_compute[192567]: 2025-10-02 08:15:25.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:26 compute-0 nova_compute[192567]: 2025-10-02 08:15:26.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:27 compute-0 nova_compute[192567]: 2025-10-02 08:15:27.142 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:15:27 compute-0 nova_compute[192567]: 2025-10-02 08:15:27.142 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:15:27 compute-0 nova_compute[192567]: 2025-10-02 08:15:27.143 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:15:27 compute-0 nova_compute[192567]: 2025-10-02 08:15:27.143 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:15:27 compute-0 nova_compute[192567]: 2025-10-02 08:15:27.166 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:15:27 compute-0 nova_compute[192567]: 2025-10-02 08:15:27.166 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:15:27 compute-0 nova_compute[192567]: 2025-10-02 08:15:27.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:15:29 compute-0 nova_compute[192567]: 2025-10-02 08:15:29.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:15:29 compute-0 nova_compute[192567]: 2025-10-02 08:15:29.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:15:29 compute-0 nova_compute[192567]: 2025-10-02 08:15:29.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:15:29 compute-0 podman[203011]: time="2025-10-02T08:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:15:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:15:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2997 "" "Go-http-client/1.1"
Oct 02 08:15:30 compute-0 nova_compute[192567]: 2025-10-02 08:15:30.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:31 compute-0 nova_compute[192567]: 2025-10-02 08:15:31.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:31 compute-0 openstack_network_exporter[205118]: ERROR   08:15:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:15:31 compute-0 openstack_network_exporter[205118]: ERROR   08:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:15:31 compute-0 openstack_network_exporter[205118]: ERROR   08:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:15:31 compute-0 openstack_network_exporter[205118]: ERROR   08:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:15:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:15:31 compute-0 openstack_network_exporter[205118]: ERROR   08:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:15:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:15:32 compute-0 nova_compute[192567]: 2025-10-02 08:15:32.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:15:33 compute-0 nova_compute[192567]: 2025-10-02 08:15:33.622 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:15:35 compute-0 nova_compute[192567]: 2025-10-02 08:15:35.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:15:35 compute-0 nova_compute[192567]: 2025-10-02 08:15:35.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:36 compute-0 nova_compute[192567]: 2025-10-02 08:15:36.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:38 compute-0 podman[217643]: 2025-10-02 08:15:38.163917534 +0000 UTC m=+0.075829133 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public)
Oct 02 08:15:40 compute-0 nova_compute[192567]: 2025-10-02 08:15:40.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:41 compute-0 nova_compute[192567]: 2025-10-02 08:15:41.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:45 compute-0 nova_compute[192567]: 2025-10-02 08:15:45.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:15:45.973 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:15:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:15:45.974 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:15:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:15:45.974 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:15:46 compute-0 nova_compute[192567]: 2025-10-02 08:15:46.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:48 compute-0 podman[217664]: 2025-10-02 08:15:48.175589288 +0000 UTC m=+0.086729251 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct 02 08:15:48 compute-0 podman[217666]: 2025-10-02 08:15:48.213257926 +0000 UTC m=+0.102561002 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:15:48 compute-0 podman[217665]: 2025-10-02 08:15:48.247645902 +0000 UTC m=+0.145235045 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:15:50 compute-0 podman[217726]: 2025-10-02 08:15:50.185945424 +0000 UTC m=+0.094837913 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid)
Oct 02 08:15:50 compute-0 nova_compute[192567]: 2025-10-02 08:15:50.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:51 compute-0 nova_compute[192567]: 2025-10-02 08:15:51.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:53 compute-0 podman[217750]: 2025-10-02 08:15:53.166469025 +0000 UTC m=+0.074543353 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:15:55 compute-0 ovn_controller[94821]: 2025-10-02T08:15:55Z|00076|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 02 08:15:55 compute-0 nova_compute[192567]: 2025-10-02 08:15:55.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:56 compute-0 nova_compute[192567]: 2025-10-02 08:15:56.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:15:57 compute-0 unix_chkpwd[217774]: password check failed for user (root)
Oct 02 08:15:57 compute-0 sshd-session[217748]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=60.161.14.45  user=root
Oct 02 08:15:59 compute-0 sshd-session[217748]: Failed password for root from 60.161.14.45 port 38028 ssh2
Oct 02 08:15:59 compute-0 podman[203011]: time="2025-10-02T08:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:15:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:15:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2997 "" "Go-http-client/1.1"
Oct 02 08:16:00 compute-0 sshd-session[217748]: Connection closed by authenticating user root 60.161.14.45 port 38028 [preauth]
Oct 02 08:16:00 compute-0 nova_compute[192567]: 2025-10-02 08:16:00.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:01 compute-0 nova_compute[192567]: 2025-10-02 08:16:01.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:01 compute-0 openstack_network_exporter[205118]: ERROR   08:16:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:16:01 compute-0 openstack_network_exporter[205118]: ERROR   08:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:16:01 compute-0 openstack_network_exporter[205118]: ERROR   08:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:16:01 compute-0 openstack_network_exporter[205118]: ERROR   08:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:16:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:16:01 compute-0 openstack_network_exporter[205118]: ERROR   08:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:16:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:16:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:16:03.814 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:16:03 compute-0 nova_compute[192567]: 2025-10-02 08:16:03.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:16:03.816 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:16:05 compute-0 nova_compute[192567]: 2025-10-02 08:16:05.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:06 compute-0 nova_compute[192567]: 2025-10-02 08:16:06.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:16:07.819 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:16:09 compute-0 podman[217776]: 2025-10-02 08:16:09.184811851 +0000 UTC m=+0.091822988 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Oct 02 08:16:10 compute-0 nova_compute[192567]: 2025-10-02 08:16:10.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:11 compute-0 nova_compute[192567]: 2025-10-02 08:16:11.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:15 compute-0 nova_compute[192567]: 2025-10-02 08:16:15.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:16 compute-0 nova_compute[192567]: 2025-10-02 08:16:16.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:19 compute-0 podman[217797]: 2025-10-02 08:16:19.180967136 +0000 UTC m=+0.084822323 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:16:19 compute-0 podman[217799]: 2025-10-02 08:16:19.197994934 +0000 UTC m=+0.093061738 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct 02 08:16:19 compute-0 podman[217798]: 2025-10-02 08:16:19.246628482 +0000 UTC m=+0.137196006 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 02 08:16:20 compute-0 nova_compute[192567]: 2025-10-02 08:16:20.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:20 compute-0 nova_compute[192567]: 2025-10-02 08:16:20.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:16:20 compute-0 nova_compute[192567]: 2025-10-02 08:16:20.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:21 compute-0 podman[217860]: 2025-10-02 08:16:21.162094335 +0000 UTC m=+0.077526325 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Oct 02 08:16:21 compute-0 nova_compute[192567]: 2025-10-02 08:16:21.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:24 compute-0 podman[217877]: 2025-10-02 08:16:24.205045994 +0000 UTC m=+0.111508350 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.645 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.682 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.683 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.683 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.683 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.890 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.891 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5885MB free_disk=73.46551513671875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.891 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.891 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.953 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:16:24 compute-0 nova_compute[192567]: 2025-10-02 08:16:24.954 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:16:25 compute-0 nova_compute[192567]: 2025-10-02 08:16:25.117 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:16:25 compute-0 nova_compute[192567]: 2025-10-02 08:16:25.132 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:16:25 compute-0 nova_compute[192567]: 2025-10-02 08:16:25.134 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:16:25 compute-0 nova_compute[192567]: 2025-10-02 08:16:25.134 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:16:25 compute-0 nova_compute[192567]: 2025-10-02 08:16:25.135 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:25 compute-0 nova_compute[192567]: 2025-10-02 08:16:25.135 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:16:25 compute-0 nova_compute[192567]: 2025-10-02 08:16:25.151 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:16:25 compute-0 nova_compute[192567]: 2025-10-02 08:16:25.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:25 compute-0 nova_compute[192567]: 2025-10-02 08:16:25.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:26 compute-0 nova_compute[192567]: 2025-10-02 08:16:26.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:26 compute-0 nova_compute[192567]: 2025-10-02 08:16:26.642 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:26 compute-0 nova_compute[192567]: 2025-10-02 08:16:26.643 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:16:26 compute-0 nova_compute[192567]: 2025-10-02 08:16:26.643 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:16:26 compute-0 nova_compute[192567]: 2025-10-02 08:16:26.664 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:16:27 compute-0 nova_compute[192567]: 2025-10-02 08:16:27.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:28 compute-0 nova_compute[192567]: 2025-10-02 08:16:28.621 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:28 compute-0 nova_compute[192567]: 2025-10-02 08:16:28.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:29 compute-0 nova_compute[192567]: 2025-10-02 08:16:29.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:29 compute-0 podman[203011]: time="2025-10-02T08:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:16:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:16:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2998 "" "Go-http-client/1.1"
Oct 02 08:16:30 compute-0 nova_compute[192567]: 2025-10-02 08:16:30.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:31 compute-0 nova_compute[192567]: 2025-10-02 08:16:31.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:31 compute-0 openstack_network_exporter[205118]: ERROR   08:16:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:16:31 compute-0 openstack_network_exporter[205118]: ERROR   08:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:16:31 compute-0 openstack_network_exporter[205118]: ERROR   08:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:16:31 compute-0 openstack_network_exporter[205118]: ERROR   08:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:16:31 compute-0 openstack_network_exporter[205118]: ERROR   08:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:16:31 compute-0 nova_compute[192567]: 2025-10-02 08:16:31.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:31 compute-0 nova_compute[192567]: 2025-10-02 08:16:31.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:16:32 compute-0 nova_compute[192567]: 2025-10-02 08:16:32.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:35 compute-0 nova_compute[192567]: 2025-10-02 08:16:35.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:36 compute-0 nova_compute[192567]: 2025-10-02 08:16:36.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:37 compute-0 nova_compute[192567]: 2025-10-02 08:16:37.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:16:40 compute-0 podman[217901]: 2025-10-02 08:16:40.163029038 +0000 UTC m=+0.077593368 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, name=ubi9-minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Oct 02 08:16:40 compute-0 nova_compute[192567]: 2025-10-02 08:16:40.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:41 compute-0 nova_compute[192567]: 2025-10-02 08:16:41.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:16:45.974 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:16:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:16:45.974 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:16:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:16:45.974 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:16:45 compute-0 nova_compute[192567]: 2025-10-02 08:16:45.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:46 compute-0 nova_compute[192567]: 2025-10-02 08:16:46.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:50 compute-0 podman[217922]: 2025-10-02 08:16:50.188590063 +0000 UTC m=+0.096899366 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:16:50 compute-0 podman[217924]: 2025-10-02 08:16:50.202849415 +0000 UTC m=+0.100415145 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 02 08:16:50 compute-0 podman[217923]: 2025-10-02 08:16:50.250482202 +0000 UTC m=+0.149957281 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:16:50 compute-0 nova_compute[192567]: 2025-10-02 08:16:50.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:51 compute-0 nova_compute[192567]: 2025-10-02 08:16:51.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:52 compute-0 podman[217986]: 2025-10-02 08:16:52.181622992 +0000 UTC m=+0.090767046 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 02 08:16:55 compute-0 podman[218007]: 2025-10-02 08:16:55.185639954 +0000 UTC m=+0.092113698 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:16:56 compute-0 nova_compute[192567]: 2025-10-02 08:16:56.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:56 compute-0 nova_compute[192567]: 2025-10-02 08:16:56.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:16:59 compute-0 podman[203011]: time="2025-10-02T08:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:16:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:16:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2995 "" "Go-http-client/1.1"
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.214 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.215 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.241 2 DEBUG nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.331 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.332 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.340 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.341 2 INFO nova.compute.claims [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.565 2 DEBUG nova.compute.provider_tree [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.588 2 DEBUG nova.scheduler.client.report [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.633 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.635 2 DEBUG nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.691 2 DEBUG nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.692 2 DEBUG nova.network.neutron [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.707 2 INFO nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.721 2 DEBUG nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.815 2 DEBUG nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.817 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.817 2 INFO nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Creating image(s)
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.818 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "/var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.818 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "/var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.819 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "/var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.832 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.911 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.912 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.913 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.923 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.978 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:00 compute-0 nova_compute[192567]: 2025-10-02 08:17:00.979 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.013 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.014 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.015 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.075 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.076 2 DEBUG nova.virt.disk.api [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Checking if we can resize image /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.077 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.153 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.154 2 DEBUG nova.virt.disk.api [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Cannot resize image /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.154 2 DEBUG nova.objects.instance [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lazy-loading 'migration_context' on Instance uuid 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.166 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.167 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Ensure instance console log exists: /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.167 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.168 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.168 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:01 compute-0 nova_compute[192567]: 2025-10-02 08:17:01.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:01 compute-0 openstack_network_exporter[205118]: ERROR   08:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:17:01 compute-0 openstack_network_exporter[205118]: ERROR   08:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:17:01 compute-0 openstack_network_exporter[205118]: ERROR   08:17:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:17:01 compute-0 openstack_network_exporter[205118]: ERROR   08:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:17:01 compute-0 openstack_network_exporter[205118]: ERROR   08:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:17:03 compute-0 nova_compute[192567]: 2025-10-02 08:17:03.937 2 DEBUG nova.network.neutron [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Successfully created port: 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:17:05 compute-0 nova_compute[192567]: 2025-10-02 08:17:05.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:05 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:05.120 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:17:05 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:05.122 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:17:06 compute-0 nova_compute[192567]: 2025-10-02 08:17:06.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:06 compute-0 nova_compute[192567]: 2025-10-02 08:17:06.051 2 DEBUG nova.network.neutron [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Successfully updated port: 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:17:06 compute-0 nova_compute[192567]: 2025-10-02 08:17:06.067 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "refresh_cache-2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:17:06 compute-0 nova_compute[192567]: 2025-10-02 08:17:06.068 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquired lock "refresh_cache-2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:17:06 compute-0 nova_compute[192567]: 2025-10-02 08:17:06.068 2 DEBUG nova.network.neutron [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:17:06 compute-0 nova_compute[192567]: 2025-10-02 08:17:06.150 2 DEBUG nova.compute.manager [req-a9278e36-f6ed-4830-bb2d-c38c26a0c09b req-8ddfb043-4c2c-4e70-a951-d859f423c3a1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Received event network-changed-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:17:06 compute-0 nova_compute[192567]: 2025-10-02 08:17:06.151 2 DEBUG nova.compute.manager [req-a9278e36-f6ed-4830-bb2d-c38c26a0c09b req-8ddfb043-4c2c-4e70-a951-d859f423c3a1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Refreshing instance network info cache due to event network-changed-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:17:06 compute-0 nova_compute[192567]: 2025-10-02 08:17:06.151 2 DEBUG oslo_concurrency.lockutils [req-a9278e36-f6ed-4830-bb2d-c38c26a0c09b req-8ddfb043-4c2c-4e70-a951-d859f423c3a1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:17:06 compute-0 nova_compute[192567]: 2025-10-02 08:17:06.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:06 compute-0 nova_compute[192567]: 2025-10-02 08:17:06.816 2 DEBUG nova.network.neutron [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.175 2 DEBUG nova.network.neutron [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Updating instance_info_cache with network_info: [{"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.213 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Releasing lock "refresh_cache-2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.214 2 DEBUG nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Instance network_info: |[{"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.215 2 DEBUG oslo_concurrency.lockutils [req-a9278e36-f6ed-4830-bb2d-c38c26a0c09b req-8ddfb043-4c2c-4e70-a951-d859f423c3a1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.216 2 DEBUG nova.network.neutron [req-a9278e36-f6ed-4830-bb2d-c38c26a0c09b req-8ddfb043-4c2c-4e70-a951-d859f423c3a1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Refreshing network info cache for port 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.221 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Start _get_guest_xml network_info=[{"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.229 2 WARNING nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.243 2 DEBUG nova.virt.libvirt.host [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.244 2 DEBUG nova.virt.libvirt.host [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.248 2 DEBUG nova.virt.libvirt.host [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.249 2 DEBUG nova.virt.libvirt.host [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.250 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.251 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.251 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.252 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.252 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.253 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.253 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.254 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.254 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.255 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.255 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.255 2 DEBUG nova.virt.hardware [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.262 2 DEBUG nova.virt.libvirt.vif [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:16:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1954629454',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1954629454',id=10,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ac58297e5b44744976c58f773f94090',ramdisk_id='',reservation_id='r-k8hhkw81',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:17:00Z,user_data=None,user_id='5455cae7258940a8926bef2dc2483570',uuid=2d4e4e51-5053-4e7b-896e-526ac0cdc1a6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.262 2 DEBUG nova.network.os_vif_util [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converting VIF {"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.264 2 DEBUG nova.network.os_vif_util [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:a6:f2,bridge_name='br-int',has_traffic_filtering=True,id=0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e5ee8e9-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.266 2 DEBUG nova.objects.instance [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.281 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <uuid>2d4e4e51-5053-4e7b-896e-526ac0cdc1a6</uuid>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <name>instance-0000000a</name>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1954629454</nova:name>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:17:08</nova:creationTime>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:17:08 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:17:08 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:17:08 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:17:08 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:17:08 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:17:08 compute-0 nova_compute[192567]:         <nova:user uuid="5455cae7258940a8926bef2dc2483570">tempest-TestExecuteHostMaintenanceStrategy-1763362073-project-admin</nova:user>
Oct 02 08:17:08 compute-0 nova_compute[192567]:         <nova:project uuid="7ac58297e5b44744976c58f773f94090">tempest-TestExecuteHostMaintenanceStrategy-1763362073</nova:project>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:17:08 compute-0 nova_compute[192567]:         <nova:port uuid="0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb">
Oct 02 08:17:08 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <system>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <entry name="serial">2d4e4e51-5053-4e7b-896e-526ac0cdc1a6</entry>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <entry name="uuid">2d4e4e51-5053-4e7b-896e-526ac0cdc1a6</entry>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     </system>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <os>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   </os>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <features>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   </features>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk.config"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:d6:a6:f2"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <target dev="tap0e5ee8e9-5d"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/console.log" append="off"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <video>
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     </video>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:17:08 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:17:08 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:17:08 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:17:08 compute-0 nova_compute[192567]: </domain>
Oct 02 08:17:08 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.282 2 DEBUG nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Preparing to wait for external event network-vif-plugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.283 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.284 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.285 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.287 2 DEBUG nova.virt.libvirt.vif [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:16:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1954629454',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1954629454',id=10,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ac58297e5b44744976c58f773f94090',ramdisk_id='',reservation_id='r-k8hhkw81',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:17:00Z,user_data=None,user_id='5455cae7258940a8926bef2dc2483570',uuid=2d4e4e51-5053-4e7b-896e-526ac0cdc1a6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.287 2 DEBUG nova.network.os_vif_util [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converting VIF {"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.289 2 DEBUG nova.network.os_vif_util [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:a6:f2,bridge_name='br-int',has_traffic_filtering=True,id=0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e5ee8e9-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.290 2 DEBUG os_vif [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:a6:f2,bridge_name='br-int',has_traffic_filtering=True,id=0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e5ee8e9-5d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.293 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.294 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.300 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e5ee8e9-5d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.301 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0e5ee8e9-5d, col_values=(('external_ids', {'iface-id': '0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:a6:f2', 'vm-uuid': '2d4e4e51-5053-4e7b-896e-526ac0cdc1a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:08 compute-0 NetworkManager[51654]: <info>  [1759393028.3539] manager: (tap0e5ee8e9-5d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.365 2 INFO os_vif [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:a6:f2,bridge_name='br-int',has_traffic_filtering=True,id=0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e5ee8e9-5d')
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.416 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.417 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.418 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] No VIF found with MAC fa:16:3e:d6:a6:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:17:08 compute-0 nova_compute[192567]: 2025-10-02 08:17:08.419 2 INFO nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Using config drive
Oct 02 08:17:09 compute-0 nova_compute[192567]: 2025-10-02 08:17:09.027 2 INFO nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Creating config drive at /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk.config
Oct 02 08:17:09 compute-0 nova_compute[192567]: 2025-10-02 08:17:09.037 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkauoh_os execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:09 compute-0 nova_compute[192567]: 2025-10-02 08:17:09.187 2 DEBUG oslo_concurrency.processutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkauoh_os" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:09 compute-0 kernel: tap0e5ee8e9-5d: entered promiscuous mode
Oct 02 08:17:09 compute-0 NetworkManager[51654]: <info>  [1759393029.2922] manager: (tap0e5ee8e9-5d): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Oct 02 08:17:09 compute-0 ovn_controller[94821]: 2025-10-02T08:17:09Z|00077|binding|INFO|Claiming lport 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb for this chassis.
Oct 02 08:17:09 compute-0 ovn_controller[94821]: 2025-10-02T08:17:09Z|00078|binding|INFO|0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb: Claiming fa:16:3e:d6:a6:f2 10.100.0.9
Oct 02 08:17:09 compute-0 nova_compute[192567]: 2025-10-02 08:17:09.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:09 compute-0 nova_compute[192567]: 2025-10-02 08:17:09.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.316 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:a6:f2 10.100.0.9'], port_security=['fa:16:3e:d6:a6:f2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2d4e4e51-5053-4e7b-896e-526ac0cdc1a6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ac58297e5b44744976c58f773f94090', 'neutron:revision_number': '2', 'neutron:security_group_ids': '92c02662-21d7-4fe7-9c02-e6a0bb798f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=176593bb-df9e-44fd-86b3-56aea7ef157a, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.318 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb in datapath d2dffba9-387a-40b6-bcfb-049fd17ed68f bound to our chassis
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.320 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2dffba9-387a-40b6-bcfb-049fd17ed68f
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.341 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[04c80f40-47e8-43ba-bf04-43dcad8ffbe0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.344 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd2dffba9-31 in ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.346 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd2dffba9-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.346 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b807c62d-b089-46b7-af49-61c4868e32eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 systemd-udevd[218067]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.347 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[94014117-6012-4d60-8128-76ee61bbea74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 systemd-machined[152597]: New machine qemu-7-instance-0000000a.
Oct 02 08:17:09 compute-0 NetworkManager[51654]: <info>  [1759393029.3730] device (tap0e5ee8e9-5d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:17:09 compute-0 NetworkManager[51654]: <info>  [1759393029.3743] device (tap0e5ee8e9-5d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.379 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[372e58a2-be6d-4e74-9d20-8195f7cb6d0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_controller[94821]: 2025-10-02T08:17:09Z|00079|binding|INFO|Setting lport 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb ovn-installed in OVS
Oct 02 08:17:09 compute-0 ovn_controller[94821]: 2025-10-02T08:17:09Z|00080|binding|INFO|Setting lport 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb up in Southbound
Oct 02 08:17:09 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000a.
Oct 02 08:17:09 compute-0 nova_compute[192567]: 2025-10-02 08:17:09.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.436 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[cd859b6d-fbd8-474b-b75d-a5903e5a4082]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.496 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[758587bf-1da0-4935-8331-59f54ea7490a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 systemd-udevd[218070]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:17:09 compute-0 NetworkManager[51654]: <info>  [1759393029.5056] manager: (tapd2dffba9-30): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.504 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[fe0561c6-28f8-4a03-9e72-59557ba76eaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.562 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec2403e-ef8e-40c0-93eb-97ef4026aade]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.566 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[3b4d2b64-8dda-4725-8855-d1d5725df8ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 NetworkManager[51654]: <info>  [1759393029.5976] device (tapd2dffba9-30): carrier: link connected
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.609 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[28b9246c-b435-497d-82a1-5858df417123]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.640 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[bfc3b3b2-b91c-4456-ac33-686472e1e62e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2dffba9-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:a1:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391718, 'reachable_time': 38185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218099, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.670 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a02a6324-7993-487b-bf1c-e55f2fa1f9ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe06:a1f9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391718, 'tstamp': 391718}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218100, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.705 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[53507fcc-a926-40de-8122-44218d158558]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2dffba9-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:a1:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391718, 'reachable_time': 38185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218101, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.758 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4d3ce39d-1830-4f96-9432-c396e608e6a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.868 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[3e8e68a1-0d6e-4803-ac72-7967de86acfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.871 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2dffba9-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.871 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.872 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2dffba9-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:09 compute-0 nova_compute[192567]: 2025-10-02 08:17:09.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:09 compute-0 NetworkManager[51654]: <info>  [1759393029.8766] manager: (tapd2dffba9-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Oct 02 08:17:09 compute-0 kernel: tapd2dffba9-30: entered promiscuous mode
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.880 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2dffba9-30, col_values=(('external_ids', {'iface-id': 'e3ee8aeb-cc58-469b-9f75-ef53474d1d07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:09 compute-0 ovn_controller[94821]: 2025-10-02T08:17:09Z|00081|binding|INFO|Releasing lport e3ee8aeb-cc58-469b-9f75-ef53474d1d07 from this chassis (sb_readonly=0)
Oct 02 08:17:09 compute-0 nova_compute[192567]: 2025-10-02 08:17:09.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.909 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d2dffba9-387a-40b6-bcfb-049fd17ed68f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d2dffba9-387a-40b6-bcfb-049fd17ed68f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:17:09 compute-0 nova_compute[192567]: 2025-10-02 08:17:09.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.911 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[fc001959-41cd-499e-af8b-c528b3007b7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.912 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-d2dffba9-387a-40b6-bcfb-049fd17ed68f
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/d2dffba9-387a-40b6-bcfb-049fd17ed68f.pid.haproxy
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID d2dffba9-387a-40b6-bcfb-049fd17ed68f
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:17:09 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:09.914 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'env', 'PROCESS_TAG=haproxy-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d2dffba9-387a-40b6-bcfb-049fd17ed68f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.084 2 DEBUG nova.compute.manager [req-9274ba18-bfb8-4e7e-9534-112d2dc0fe48 req-b763a2e7-9e27-4a85-9c45-8e706c6b4222 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Received event network-vif-plugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.086 2 DEBUG oslo_concurrency.lockutils [req-9274ba18-bfb8-4e7e-9534-112d2dc0fe48 req-b763a2e7-9e27-4a85-9c45-8e706c6b4222 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.086 2 DEBUG oslo_concurrency.lockutils [req-9274ba18-bfb8-4e7e-9534-112d2dc0fe48 req-b763a2e7-9e27-4a85-9c45-8e706c6b4222 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.087 2 DEBUG oslo_concurrency.lockutils [req-9274ba18-bfb8-4e7e-9534-112d2dc0fe48 req-b763a2e7-9e27-4a85-9c45-8e706c6b4222 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.087 2 DEBUG nova.compute.manager [req-9274ba18-bfb8-4e7e-9534-112d2dc0fe48 req-b763a2e7-9e27-4a85-9c45-8e706c6b4222 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Processing event network-vif-plugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.330 2 DEBUG nova.network.neutron [req-a9278e36-f6ed-4830-bb2d-c38c26a0c09b req-8ddfb043-4c2c-4e70-a951-d859f423c3a1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Updated VIF entry in instance network info cache for port 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.331 2 DEBUG nova.network.neutron [req-a9278e36-f6ed-4830-bb2d-c38c26a0c09b req-8ddfb043-4c2c-4e70-a951-d859f423c3a1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Updating instance_info_cache with network_info: [{"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.354 2 DEBUG oslo_concurrency.lockutils [req-a9278e36-f6ed-4830-bb2d-c38c26a0c09b req-8ddfb043-4c2c-4e70-a951-d859f423c3a1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:17:10 compute-0 podman[218140]: 2025-10-02 08:17:10.404428958 +0000 UTC m=+0.084524820 container create 95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 02 08:17:10 compute-0 podman[218140]: 2025-10-02 08:17:10.363623048 +0000 UTC m=+0.043719020 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:17:10 compute-0 systemd[1]: Started libpod-conmon-95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628.scope.
Oct 02 08:17:10 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:17:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf73597b1147e06c8e41a4ed7b7cfad4da6d9d7d21efdb54a2e3086b3c3ee7ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:17:10 compute-0 podman[218140]: 2025-10-02 08:17:10.506436982 +0000 UTC m=+0.186532854 container init 95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 02 08:17:10 compute-0 podman[218140]: 2025-10-02 08:17:10.512723608 +0000 UTC m=+0.192819460 container start 95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.519 2 DEBUG nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.521 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393030.519003, 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.521 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] VM Started (Lifecycle Event)
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.526 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.532 2 INFO nova.virt.libvirt.driver [-] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Instance spawned successfully.
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.533 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:17:10 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218162]: [NOTICE]   (218174) : New worker (218180) forked
Oct 02 08:17:10 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218162]: [NOTICE]   (218174) : Loading success.
Oct 02 08:17:10 compute-0 podman[218154]: 2025-10-02 08:17:10.552717342 +0000 UTC m=+0.102038285 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.558 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.564 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.568 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.569 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.569 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.569 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.570 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.570 2 DEBUG nova.virt.libvirt.driver [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.623 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.624 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393030.5214026, 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.624 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] VM Paused (Lifecycle Event)
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.656 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.662 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393030.5254257, 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.662 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] VM Resumed (Lifecycle Event)
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.668 2 INFO nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Took 9.85 seconds to spawn the instance on the hypervisor.
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.668 2 DEBUG nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.679 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.683 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.706 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.735 2 INFO nova.compute.manager [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Took 10.45 seconds to build instance.
Oct 02 08:17:10 compute-0 nova_compute[192567]: 2025-10-02 08:17:10.750 2 DEBUG oslo_concurrency.lockutils [None req-21d4fc17-fcfd-466c-a962-0292ee13de02 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:11 compute-0 nova_compute[192567]: 2025-10-02 08:17:11.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:12 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:12.125 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:12 compute-0 nova_compute[192567]: 2025-10-02 08:17:12.216 2 DEBUG nova.compute.manager [req-8cbc0f62-7977-465d-b0a0-d8360b5c667b req-7955d2fa-1b5c-4839-9906-9d3a50920733 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Received event network-vif-plugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:17:12 compute-0 nova_compute[192567]: 2025-10-02 08:17:12.217 2 DEBUG oslo_concurrency.lockutils [req-8cbc0f62-7977-465d-b0a0-d8360b5c667b req-7955d2fa-1b5c-4839-9906-9d3a50920733 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:12 compute-0 nova_compute[192567]: 2025-10-02 08:17:12.218 2 DEBUG oslo_concurrency.lockutils [req-8cbc0f62-7977-465d-b0a0-d8360b5c667b req-7955d2fa-1b5c-4839-9906-9d3a50920733 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:12 compute-0 nova_compute[192567]: 2025-10-02 08:17:12.218 2 DEBUG oslo_concurrency.lockutils [req-8cbc0f62-7977-465d-b0a0-d8360b5c667b req-7955d2fa-1b5c-4839-9906-9d3a50920733 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:12 compute-0 nova_compute[192567]: 2025-10-02 08:17:12.219 2 DEBUG nova.compute.manager [req-8cbc0f62-7977-465d-b0a0-d8360b5c667b req-7955d2fa-1b5c-4839-9906-9d3a50920733 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] No waiting events found dispatching network-vif-plugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:17:12 compute-0 nova_compute[192567]: 2025-10-02 08:17:12.219 2 WARNING nova.compute.manager [req-8cbc0f62-7977-465d-b0a0-d8360b5c667b req-7955d2fa-1b5c-4839-9906-9d3a50920733 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Received unexpected event network-vif-plugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb for instance with vm_state active and task_state None.
Oct 02 08:17:13 compute-0 nova_compute[192567]: 2025-10-02 08:17:13.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:16 compute-0 nova_compute[192567]: 2025-10-02 08:17:16.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:18 compute-0 nova_compute[192567]: 2025-10-02 08:17:18.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:21 compute-0 nova_compute[192567]: 2025-10-02 08:17:21.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:21 compute-0 ovn_controller[94821]: 2025-10-02T08:17:21Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d6:a6:f2 10.100.0.9
Oct 02 08:17:21 compute-0 ovn_controller[94821]: 2025-10-02T08:17:21Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d6:a6:f2 10.100.0.9
Oct 02 08:17:21 compute-0 podman[218202]: 2025-10-02 08:17:21.202257395 +0000 UTC m=+0.093983315 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:17:21 compute-0 podman[218200]: 2025-10-02 08:17:21.214059692 +0000 UTC m=+0.115159754 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 08:17:21 compute-0 podman[218201]: 2025-10-02 08:17:21.240221136 +0000 UTC m=+0.134383033 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 02 08:17:23 compute-0 podman[218265]: 2025-10-02 08:17:23.218732369 +0000 UTC m=+0.127499218 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct 02 08:17:23 compute-0 nova_compute[192567]: 2025-10-02 08:17:23.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:24 compute-0 nova_compute[192567]: 2025-10-02 08:17:24.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:17:24 compute-0 nova_compute[192567]: 2025-10-02 08:17:24.653 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:24 compute-0 nova_compute[192567]: 2025-10-02 08:17:24.655 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:24 compute-0 nova_compute[192567]: 2025-10-02 08:17:24.656 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:24 compute-0 nova_compute[192567]: 2025-10-02 08:17:24.657 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:17:24 compute-0 nova_compute[192567]: 2025-10-02 08:17:24.754 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:24 compute-0 nova_compute[192567]: 2025-10-02 08:17:24.853 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:24 compute-0 nova_compute[192567]: 2025-10-02 08:17:24.855 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:24 compute-0 nova_compute[192567]: 2025-10-02 08:17:24.928 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.124 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.126 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5695MB free_disk=73.4365005493164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.126 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.126 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.208 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.208 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.209 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.290 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.304 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.342 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:17:25 compute-0 nova_compute[192567]: 2025-10-02 08:17:25.342 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:26 compute-0 nova_compute[192567]: 2025-10-02 08:17:26.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:26 compute-0 podman[218292]: 2025-10-02 08:17:26.177013183 +0000 UTC m=+0.082648202 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:17:27 compute-0 nova_compute[192567]: 2025-10-02 08:17:27.344 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:17:27 compute-0 nova_compute[192567]: 2025-10-02 08:17:27.345 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:17:27 compute-0 nova_compute[192567]: 2025-10-02 08:17:27.346 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:17:27 compute-0 nova_compute[192567]: 2025-10-02 08:17:27.726 2 DEBUG nova.virt.libvirt.driver [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Creating tmpfile /var/lib/nova/instances/tmpkr55vccv to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:17:27 compute-0 nova_compute[192567]: 2025-10-02 08:17:27.728 2 DEBUG nova.compute.manager [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkr55vccv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:17:27 compute-0 nova_compute[192567]: 2025-10-02 08:17:27.825 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:17:27 compute-0 nova_compute[192567]: 2025-10-02 08:17:27.826 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:17:27 compute-0 nova_compute[192567]: 2025-10-02 08:17:27.826 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:17:27 compute-0 nova_compute[192567]: 2025-10-02 08:17:27.826 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:17:28 compute-0 nova_compute[192567]: 2025-10-02 08:17:28.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:28 compute-0 nova_compute[192567]: 2025-10-02 08:17:28.849 2 DEBUG nova.compute.manager [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkr55vccv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d8d2f2de-031b-4ba3-8896-64dd2578899e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:17:28 compute-0 nova_compute[192567]: 2025-10-02 08:17:28.885 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-d8d2f2de-031b-4ba3-8896-64dd2578899e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:17:28 compute-0 nova_compute[192567]: 2025-10-02 08:17:28.885 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-d8d2f2de-031b-4ba3-8896-64dd2578899e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:17:28 compute-0 nova_compute[192567]: 2025-10-02 08:17:28.886 2 DEBUG nova.network.neutron [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:17:29 compute-0 nova_compute[192567]: 2025-10-02 08:17:29.557 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Updating instance_info_cache with network_info: [{"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:17:29 compute-0 nova_compute[192567]: 2025-10-02 08:17:29.581 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:17:29 compute-0 nova_compute[192567]: 2025-10-02 08:17:29.582 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:17:29 compute-0 nova_compute[192567]: 2025-10-02 08:17:29.582 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:17:29 compute-0 nova_compute[192567]: 2025-10-02 08:17:29.583 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:17:29 compute-0 podman[203011]: time="2025-10-02T08:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:17:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:17:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3456 "" "Go-http-client/1.1"
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.204 2 DEBUG nova.network.neutron [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Updating instance_info_cache with network_info: [{"id": "c1d0d784-50e4-485f-b045-97036b41371f", "address": "fa:16:3e:fc:5e:54", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d0d784-50", "ovs_interfaceid": "c1d0d784-50e4-485f-b045-97036b41371f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.227 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-d8d2f2de-031b-4ba3-8896-64dd2578899e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.230 2 DEBUG nova.virt.libvirt.driver [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkr55vccv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d8d2f2de-031b-4ba3-8896-64dd2578899e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.230 2 DEBUG nova.virt.libvirt.driver [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Creating instance directory: /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.231 2 DEBUG nova.virt.libvirt.driver [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Creating disk.info with the contents: {'/var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk': 'qcow2', '/var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.231 2 DEBUG nova.virt.libvirt.driver [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.232 2 DEBUG nova.objects.instance [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid d8d2f2de-031b-4ba3-8896-64dd2578899e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.276 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.344 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.345 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.347 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.377 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.440 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.442 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.501 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.503 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.503 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.602 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.604 2 DEBUG nova.virt.disk.api [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.605 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.668 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.670 2 DEBUG nova.virt.disk.api [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.671 2 DEBUG nova.objects.instance [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid d8d2f2de-031b-4ba3-8896-64dd2578899e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.688 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.731 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk.config 485376" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.733 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk.config to /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:17:30 compute-0 nova_compute[192567]: 2025-10-02 08:17:30.734 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk.config /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.239 2 DEBUG oslo_concurrency.processutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e/disk.config /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.240 2 DEBUG nova.virt.libvirt.driver [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.242 2 DEBUG nova.virt.libvirt.vif [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:16:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1734897634',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1734897634',id=9,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:16:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7ac58297e5b44744976c58f773f94090',ramdisk_id='',reservation_id='r-0rb6n7qe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:16:54Z,user_data=None,user_id='5455cae7258940a8926bef2dc2483570',uuid=d8d2f2de-031b-4ba3-8896-64dd2578899e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c1d0d784-50e4-485f-b045-97036b41371f", "address": "fa:16:3e:fc:5e:54", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc1d0d784-50", "ovs_interfaceid": "c1d0d784-50e4-485f-b045-97036b41371f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.242 2 DEBUG nova.network.os_vif_util [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "c1d0d784-50e4-485f-b045-97036b41371f", "address": "fa:16:3e:fc:5e:54", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc1d0d784-50", "ovs_interfaceid": "c1d0d784-50e4-485f-b045-97036b41371f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.244 2 DEBUG nova.network.os_vif_util [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=c1d0d784-50e4-485f-b045-97036b41371f,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d0d784-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.244 2 DEBUG os_vif [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=c1d0d784-50e4-485f-b045-97036b41371f,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d0d784-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.246 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.246 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.252 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1d0d784-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.253 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc1d0d784-50, col_values=(('external_ids', {'iface-id': 'c1d0d784-50e4-485f-b045-97036b41371f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:5e:54', 'vm-uuid': 'd8d2f2de-031b-4ba3-8896-64dd2578899e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:31 compute-0 NetworkManager[51654]: <info>  [1759393051.2565] manager: (tapc1d0d784-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.270 2 INFO os_vif [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=c1d0d784-50e4-485f-b045-97036b41371f,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d0d784-50')
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.271 2 DEBUG nova.virt.libvirt.driver [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.271 2 DEBUG nova.compute.manager [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkr55vccv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d8d2f2de-031b-4ba3-8896-64dd2578899e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:17:31 compute-0 openstack_network_exporter[205118]: ERROR   08:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:17:31 compute-0 openstack_network_exporter[205118]: ERROR   08:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:17:31 compute-0 openstack_network_exporter[205118]: ERROR   08:17:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:17:31 compute-0 openstack_network_exporter[205118]: ERROR   08:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:17:31 compute-0 openstack_network_exporter[205118]: ERROR   08:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:17:31 compute-0 nova_compute[192567]: 2025-10-02 08:17:31.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:17:33 compute-0 nova_compute[192567]: 2025-10-02 08:17:33.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:17:33 compute-0 nova_compute[192567]: 2025-10-02 08:17:33.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:17:33 compute-0 nova_compute[192567]: 2025-10-02 08:17:33.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:17:33 compute-0 nova_compute[192567]: 2025-10-02 08:17:33.908 2 DEBUG nova.network.neutron [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Port c1d0d784-50e4-485f-b045-97036b41371f updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:17:33 compute-0 nova_compute[192567]: 2025-10-02 08:17:33.911 2 DEBUG nova.compute.manager [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkr55vccv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d8d2f2de-031b-4ba3-8896-64dd2578899e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:17:34 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 08:17:34 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 08:17:34 compute-0 kernel: tapc1d0d784-50: entered promiscuous mode
Oct 02 08:17:34 compute-0 NetworkManager[51654]: <info>  [1759393054.2958] manager: (tapc1d0d784-50): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Oct 02 08:17:34 compute-0 nova_compute[192567]: 2025-10-02 08:17:34.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:34 compute-0 ovn_controller[94821]: 2025-10-02T08:17:34Z|00082|binding|INFO|Claiming lport c1d0d784-50e4-485f-b045-97036b41371f for this additional chassis.
Oct 02 08:17:34 compute-0 ovn_controller[94821]: 2025-10-02T08:17:34Z|00083|binding|INFO|c1d0d784-50e4-485f-b045-97036b41371f: Claiming fa:16:3e:fc:5e:54 10.100.0.10
Oct 02 08:17:34 compute-0 ovn_controller[94821]: 2025-10-02T08:17:34Z|00084|binding|INFO|Setting lport c1d0d784-50e4-485f-b045-97036b41371f ovn-installed in OVS
Oct 02 08:17:34 compute-0 nova_compute[192567]: 2025-10-02 08:17:34.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:34 compute-0 nova_compute[192567]: 2025-10-02 08:17:34.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:34 compute-0 systemd-udevd[218369]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:17:34 compute-0 systemd-machined[152597]: New machine qemu-8-instance-00000009.
Oct 02 08:17:34 compute-0 NetworkManager[51654]: <info>  [1759393054.3722] device (tapc1d0d784-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:17:34 compute-0 NetworkManager[51654]: <info>  [1759393054.3742] device (tapc1d0d784-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:17:34 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000009.
Oct 02 08:17:35 compute-0 nova_compute[192567]: 2025-10-02 08:17:35.621 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:17:36 compute-0 nova_compute[192567]: 2025-10-02 08:17:36.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:36 compute-0 nova_compute[192567]: 2025-10-02 08:17:36.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:36 compute-0 nova_compute[192567]: 2025-10-02 08:17:36.408 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393056.407858, d8d2f2de-031b-4ba3-8896-64dd2578899e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:17:36 compute-0 nova_compute[192567]: 2025-10-02 08:17:36.409 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] VM Started (Lifecycle Event)
Oct 02 08:17:36 compute-0 nova_compute[192567]: 2025-10-02 08:17:36.431 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:17:37 compute-0 nova_compute[192567]: 2025-10-02 08:17:37.172 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393057.1715372, d8d2f2de-031b-4ba3-8896-64dd2578899e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:17:37 compute-0 nova_compute[192567]: 2025-10-02 08:17:37.173 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] VM Resumed (Lifecycle Event)
Oct 02 08:17:37 compute-0 nova_compute[192567]: 2025-10-02 08:17:37.197 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:17:37 compute-0 nova_compute[192567]: 2025-10-02 08:17:37.201 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:17:37 compute-0 nova_compute[192567]: 2025-10-02 08:17:37.223 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:17:38 compute-0 nova_compute[192567]: 2025-10-02 08:17:38.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:17:38 compute-0 ovn_controller[94821]: 2025-10-02T08:17:38Z|00085|binding|INFO|Claiming lport c1d0d784-50e4-485f-b045-97036b41371f for this chassis.
Oct 02 08:17:38 compute-0 ovn_controller[94821]: 2025-10-02T08:17:38Z|00086|binding|INFO|c1d0d784-50e4-485f-b045-97036b41371f: Claiming fa:16:3e:fc:5e:54 10.100.0.10
Oct 02 08:17:38 compute-0 ovn_controller[94821]: 2025-10-02T08:17:38Z|00087|binding|INFO|Setting lport c1d0d784-50e4-485f-b045-97036b41371f up in Southbound
Oct 02 08:17:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:38.978 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:5e:54 10.100.0.10'], port_security=['fa:16:3e:fc:5e:54 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd8d2f2de-031b-4ba3-8896-64dd2578899e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ac58297e5b44744976c58f773f94090', 'neutron:revision_number': '11', 'neutron:security_group_ids': '92c02662-21d7-4fe7-9c02-e6a0bb798f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=176593bb-df9e-44fd-86b3-56aea7ef157a, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=c1d0d784-50e4-485f-b045-97036b41371f) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:17:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:38.980 103703 INFO neutron.agent.ovn.metadata.agent [-] Port c1d0d784-50e4-485f-b045-97036b41371f in datapath d2dffba9-387a-40b6-bcfb-049fd17ed68f bound to our chassis
Oct 02 08:17:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:38.982 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2dffba9-387a-40b6-bcfb-049fd17ed68f
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:38.999 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7a1566-1e84-4c4c-9d41-b353fb571c36]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:39.050 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[273c85c9-3906-4231-9f60-d2a420ba286b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:39.056 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[5c0066f2-ffcf-48e1-9f63-a46131ed2b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:39.106 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[7c19c5a1-2784-41b9-93df-26a3c5716ae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:39.135 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[550c8455-e688-4b4d-afcb-958710ae931a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2dffba9-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:a1:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 6, 'rx_bytes': 1126, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 6, 'rx_bytes': 1126, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391718, 'reachable_time': 38185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218405, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:39.157 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[06bf9c2e-a5ac-4399-9fb0-464f26b120ac]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd2dffba9-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391740, 'tstamp': 391740}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218406, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd2dffba9-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391744, 'tstamp': 391744}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218406, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:39.159 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2dffba9-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:39 compute-0 nova_compute[192567]: 2025-10-02 08:17:39.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:39 compute-0 nova_compute[192567]: 2025-10-02 08:17:39.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:39.163 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2dffba9-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:39.163 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:39.164 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2dffba9-30, col_values=(('external_ids', {'iface-id': 'e3ee8aeb-cc58-469b-9f75-ef53474d1d07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:39.164 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:17:39 compute-0 nova_compute[192567]: 2025-10-02 08:17:39.166 2 INFO nova.compute.manager [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Post operation of migration started
Oct 02 08:17:39 compute-0 nova_compute[192567]: 2025-10-02 08:17:39.872 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-d8d2f2de-031b-4ba3-8896-64dd2578899e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:17:39 compute-0 nova_compute[192567]: 2025-10-02 08:17:39.873 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-d8d2f2de-031b-4ba3-8896-64dd2578899e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:17:39 compute-0 nova_compute[192567]: 2025-10-02 08:17:39.874 2 DEBUG nova.network.neutron [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:17:41 compute-0 nova_compute[192567]: 2025-10-02 08:17:41.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:41 compute-0 podman[218407]: 2025-10-02 08:17:41.204857374 +0000 UTC m=+0.110795587 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 02 08:17:41 compute-0 nova_compute[192567]: 2025-10-02 08:17:41.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:41 compute-0 nova_compute[192567]: 2025-10-02 08:17:41.311 2 DEBUG nova.network.neutron [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Updating instance_info_cache with network_info: [{"id": "c1d0d784-50e4-485f-b045-97036b41371f", "address": "fa:16:3e:fc:5e:54", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d0d784-50", "ovs_interfaceid": "c1d0d784-50e4-485f-b045-97036b41371f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:17:41 compute-0 nova_compute[192567]: 2025-10-02 08:17:41.370 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-d8d2f2de-031b-4ba3-8896-64dd2578899e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:17:41 compute-0 nova_compute[192567]: 2025-10-02 08:17:41.391 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:41 compute-0 nova_compute[192567]: 2025-10-02 08:17:41.391 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:41 compute-0 nova_compute[192567]: 2025-10-02 08:17:41.392 2 DEBUG oslo_concurrency.lockutils [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:41 compute-0 nova_compute[192567]: 2025-10-02 08:17:41.396 2 INFO nova.virt.libvirt.driver [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:17:41 compute-0 virtqemud[192112]: Domain id=8 name='instance-00000009' uuid=d8d2f2de-031b-4ba3-8896-64dd2578899e is tainted: custom-monitor
Oct 02 08:17:42 compute-0 nova_compute[192567]: 2025-10-02 08:17:42.405 2 INFO nova.virt.libvirt.driver [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:17:43 compute-0 nova_compute[192567]: 2025-10-02 08:17:43.415 2 INFO nova.virt.libvirt.driver [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:17:43 compute-0 nova_compute[192567]: 2025-10-02 08:17:43.423 2 DEBUG nova.compute.manager [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:17:43 compute-0 nova_compute[192567]: 2025-10-02 08:17:43.445 2 DEBUG nova.objects.instance [None req-61db10b7-a57a-44c6-84fd-cc1013d227a7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:17:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:45.975 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:45.976 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:45.977 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:46 compute-0 nova_compute[192567]: 2025-10-02 08:17:46.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:46 compute-0 nova_compute[192567]: 2025-10-02 08:17:46.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:46 compute-0 nova_compute[192567]: 2025-10-02 08:17:46.993 2 DEBUG oslo_concurrency.lockutils [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:46 compute-0 nova_compute[192567]: 2025-10-02 08:17:46.994 2 DEBUG oslo_concurrency.lockutils [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:46 compute-0 nova_compute[192567]: 2025-10-02 08:17:46.994 2 DEBUG oslo_concurrency.lockutils [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:46 compute-0 nova_compute[192567]: 2025-10-02 08:17:46.995 2 DEBUG oslo_concurrency.lockutils [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:46 compute-0 nova_compute[192567]: 2025-10-02 08:17:46.995 2 DEBUG oslo_concurrency.lockutils [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:46 compute-0 nova_compute[192567]: 2025-10-02 08:17:46.997 2 INFO nova.compute.manager [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Terminating instance
Oct 02 08:17:46 compute-0 nova_compute[192567]: 2025-10-02 08:17:46.999 2 DEBUG nova.compute.manager [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:17:47 compute-0 kernel: tap0e5ee8e9-5d (unregistering): left promiscuous mode
Oct 02 08:17:47 compute-0 NetworkManager[51654]: <info>  [1759393067.0260] device (tap0e5ee8e9-5d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:17:47 compute-0 ovn_controller[94821]: 2025-10-02T08:17:47Z|00088|binding|INFO|Releasing lport 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb from this chassis (sb_readonly=0)
Oct 02 08:17:47 compute-0 ovn_controller[94821]: 2025-10-02T08:17:47Z|00089|binding|INFO|Setting lport 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb down in Southbound
Oct 02 08:17:47 compute-0 ovn_controller[94821]: 2025-10-02T08:17:47Z|00090|binding|INFO|Removing iface tap0e5ee8e9-5d ovn-installed in OVS
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.050 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:a6:f2 10.100.0.9'], port_security=['fa:16:3e:d6:a6:f2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2d4e4e51-5053-4e7b-896e-526ac0cdc1a6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ac58297e5b44744976c58f773f94090', 'neutron:revision_number': '4', 'neutron:security_group_ids': '92c02662-21d7-4fe7-9c02-e6a0bb798f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=176593bb-df9e-44fd-86b3-56aea7ef157a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.053 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb in datapath d2dffba9-387a-40b6-bcfb-049fd17ed68f unbound from our chassis
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.055 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2dffba9-387a-40b6-bcfb-049fd17ed68f
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.078 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ff0eba-8367-491e-bf09-061386e1e2fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:47 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Oct 02 08:17:47 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Consumed 13.532s CPU time.
Oct 02 08:17:47 compute-0 systemd-machined[152597]: Machine qemu-7-instance-0000000a terminated.
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.126 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[a03d5fd1-a8fe-4573-abeb-f9dae560e539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.130 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2f831c-1970-4b02-ab95-df813de35ee4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.172 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[c706bad4-b05d-4f52-bee0-274ed1eb9f6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.199 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[3dcf4667-a467-4db3-98be-803f245e7d6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2dffba9-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:a1:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391718, 'reachable_time': 38185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218442, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.225 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[af1af684-c0ba-4f2d-a7b9-1db230a44187]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd2dffba9-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391740, 'tstamp': 391740}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218443, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd2dffba9-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 391744, 'tstamp': 391744}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218443, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.228 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2dffba9-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.287 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2dffba9-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.288 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.288 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2dffba9-30, col_values=(('external_ids', {'iface-id': 'e3ee8aeb-cc58-469b-9f75-ef53474d1d07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:47.289 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.339 2 INFO nova.virt.libvirt.driver [-] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Instance destroyed successfully.
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.340 2 DEBUG nova.objects.instance [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lazy-loading 'resources' on Instance uuid 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.360 2 DEBUG nova.virt.libvirt.vif [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:16:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1954629454',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1954629454',id=10,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:17:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ac58297e5b44744976c58f773f94090',ramdisk_id='',reservation_id='r-k8hhkw81',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:17:10Z,user_data=None,user_id='5455cae7258940a8926bef2dc2483570',uuid=2d4e4e51-5053-4e7b-896e-526ac0cdc1a6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.361 2 DEBUG nova.network.os_vif_util [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converting VIF {"id": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "address": "fa:16:3e:d6:a6:f2", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e5ee8e9-5d", "ovs_interfaceid": "0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.362 2 DEBUG nova.network.os_vif_util [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d6:a6:f2,bridge_name='br-int',has_traffic_filtering=True,id=0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e5ee8e9-5d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.362 2 DEBUG os_vif [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:a6:f2,bridge_name='br-int',has_traffic_filtering=True,id=0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e5ee8e9-5d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.365 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e5ee8e9-5d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.373 2 INFO os_vif [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:a6:f2,bridge_name='br-int',has_traffic_filtering=True,id=0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e5ee8e9-5d')
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.374 2 INFO nova.virt.libvirt.driver [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Deleting instance files /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6_del
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.375 2 INFO nova.virt.libvirt.driver [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Deletion of /var/lib/nova/instances/2d4e4e51-5053-4e7b-896e-526ac0cdc1a6_del complete
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.435 2 INFO nova.compute.manager [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Took 0.44 seconds to destroy the instance on the hypervisor.
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.436 2 DEBUG oslo.service.loopingcall [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.436 2 DEBUG nova.compute.manager [-] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:17:47 compute-0 nova_compute[192567]: 2025-10-02 08:17:47.437 2 DEBUG nova.network.neutron [-] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.027 2 DEBUG nova.compute.manager [req-e8e80498-5a21-424c-9f22-5d2427a17ed8 req-d17913e8-e34b-4d56-8e18-2818942f5d2b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Received event network-vif-unplugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.028 2 DEBUG oslo_concurrency.lockutils [req-e8e80498-5a21-424c-9f22-5d2427a17ed8 req-d17913e8-e34b-4d56-8e18-2818942f5d2b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.028 2 DEBUG oslo_concurrency.lockutils [req-e8e80498-5a21-424c-9f22-5d2427a17ed8 req-d17913e8-e34b-4d56-8e18-2818942f5d2b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.029 2 DEBUG oslo_concurrency.lockutils [req-e8e80498-5a21-424c-9f22-5d2427a17ed8 req-d17913e8-e34b-4d56-8e18-2818942f5d2b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.029 2 DEBUG nova.compute.manager [req-e8e80498-5a21-424c-9f22-5d2427a17ed8 req-d17913e8-e34b-4d56-8e18-2818942f5d2b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] No waiting events found dispatching network-vif-unplugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.030 2 DEBUG nova.compute.manager [req-e8e80498-5a21-424c-9f22-5d2427a17ed8 req-d17913e8-e34b-4d56-8e18-2818942f5d2b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Received event network-vif-unplugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.239 2 DEBUG nova.network.neutron [-] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.257 2 INFO nova.compute.manager [-] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Took 0.82 seconds to deallocate network for instance.
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.298 2 DEBUG oslo_concurrency.lockutils [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.298 2 DEBUG oslo_concurrency.lockutils [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.359 2 DEBUG nova.compute.manager [req-c311af57-dbd5-4809-86e6-afcd842eaadc req-5036085f-db5a-4a73-971a-fda3899fb2d5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Received event network-vif-deleted-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.412 2 DEBUG nova.compute.provider_tree [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.432 2 DEBUG nova.scheduler.client.report [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.452 2 DEBUG oslo_concurrency.lockutils [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.480 2 INFO nova.scheduler.client.report [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Deleted allocations for instance 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6
Oct 02 08:17:48 compute-0 nova_compute[192567]: 2025-10-02 08:17:48.550 2 DEBUG oslo_concurrency.lockutils [None req-6a4dca1f-9485-4e68-8e77-fbe2636325a1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.440 2 DEBUG oslo_concurrency.lockutils [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "d8d2f2de-031b-4ba3-8896-64dd2578899e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.440 2 DEBUG oslo_concurrency.lockutils [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "d8d2f2de-031b-4ba3-8896-64dd2578899e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.441 2 DEBUG oslo_concurrency.lockutils [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "d8d2f2de-031b-4ba3-8896-64dd2578899e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.441 2 DEBUG oslo_concurrency.lockutils [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "d8d2f2de-031b-4ba3-8896-64dd2578899e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.441 2 DEBUG oslo_concurrency.lockutils [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "d8d2f2de-031b-4ba3-8896-64dd2578899e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.443 2 INFO nova.compute.manager [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Terminating instance
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.445 2 DEBUG nova.compute.manager [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:17:49 compute-0 kernel: tapc1d0d784-50 (unregistering): left promiscuous mode
Oct 02 08:17:49 compute-0 NetworkManager[51654]: <info>  [1759393069.4828] device (tapc1d0d784-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:17:49 compute-0 ovn_controller[94821]: 2025-10-02T08:17:49Z|00091|binding|INFO|Releasing lport c1d0d784-50e4-485f-b045-97036b41371f from this chassis (sb_readonly=0)
Oct 02 08:17:49 compute-0 ovn_controller[94821]: 2025-10-02T08:17:49Z|00092|binding|INFO|Setting lport c1d0d784-50e4-485f-b045-97036b41371f down in Southbound
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:49 compute-0 ovn_controller[94821]: 2025-10-02T08:17:49Z|00093|binding|INFO|Removing iface tapc1d0d784-50 ovn-installed in OVS
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.499 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:5e:54 10.100.0.10'], port_security=['fa:16:3e:fc:5e:54 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd8d2f2de-031b-4ba3-8896-64dd2578899e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ac58297e5b44744976c58f773f94090', 'neutron:revision_number': '11', 'neutron:security_group_ids': '92c02662-21d7-4fe7-9c02-e6a0bb798f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=176593bb-df9e-44fd-86b3-56aea7ef157a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=c1d0d784-50e4-485f-b045-97036b41371f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.501 103703 INFO neutron.agent.ovn.metadata.agent [-] Port c1d0d784-50e4-485f-b045-97036b41371f in datapath d2dffba9-387a-40b6-bcfb-049fd17ed68f unbound from our chassis
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.503 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d2dffba9-387a-40b6-bcfb-049fd17ed68f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.504 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[46027236-0f79-4ed6-ac57-f5149c5f65d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.505 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f namespace which is not needed anymore
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:49 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000009.scope: Deactivated successfully.
Oct 02 08:17:49 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000009.scope: Consumed 3.110s CPU time.
Oct 02 08:17:49 compute-0 systemd-machined[152597]: Machine qemu-8-instance-00000009 terminated.
Oct 02 08:17:49 compute-0 NetworkManager[51654]: <info>  [1759393069.6720] manager: (tapc1d0d784-50): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:49 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218162]: [NOTICE]   (218174) : haproxy version is 2.8.14-c23fe91
Oct 02 08:17:49 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218162]: [NOTICE]   (218174) : path to executable is /usr/sbin/haproxy
Oct 02 08:17:49 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218162]: [ALERT]    (218174) : Current worker (218180) exited with code 143 (Terminated)
Oct 02 08:17:49 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218162]: [WARNING]  (218174) : All workers exited. Exiting... (0)
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:49 compute-0 systemd[1]: libpod-95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628.scope: Deactivated successfully.
Oct 02 08:17:49 compute-0 podman[218484]: 2025-10-02 08:17:49.694062591 +0000 UTC m=+0.072993852 container died 95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.738 2 INFO nova.virt.libvirt.driver [-] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Instance destroyed successfully.
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.739 2 DEBUG nova.objects.instance [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lazy-loading 'resources' on Instance uuid d8d2f2de-031b-4ba3-8896-64dd2578899e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:17:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628-userdata-shm.mount: Deactivated successfully.
Oct 02 08:17:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-cf73597b1147e06c8e41a4ed7b7cfad4da6d9d7d21efdb54a2e3086b3c3ee7ce-merged.mount: Deactivated successfully.
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.759 2 DEBUG nova.virt.libvirt.vif [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:16:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1734897634',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1734897634',id=9,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:16:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ac58297e5b44744976c58f773f94090',ramdisk_id='',reservation_id='r-0rb6n7qe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:17:43Z,user_data=None,user_id='5455cae7258940a8926bef2dc2483570',uuid=d8d2f2de-031b-4ba3-8896-64dd2578899e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c1d0d784-50e4-485f-b045-97036b41371f", "address": "fa:16:3e:fc:5e:54", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d0d784-50", "ovs_interfaceid": "c1d0d784-50e4-485f-b045-97036b41371f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.760 2 DEBUG nova.network.os_vif_util [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converting VIF {"id": "c1d0d784-50e4-485f-b045-97036b41371f", "address": "fa:16:3e:fc:5e:54", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d0d784-50", "ovs_interfaceid": "c1d0d784-50e4-485f-b045-97036b41371f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.761 2 DEBUG nova.network.os_vif_util [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fc:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=c1d0d784-50e4-485f-b045-97036b41371f,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d0d784-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.762 2 DEBUG os_vif [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=c1d0d784-50e4-485f-b045-97036b41371f,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d0d784-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.764 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1d0d784-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:49 compute-0 podman[218484]: 2025-10-02 08:17:49.770099777 +0000 UTC m=+0.149031008 container cleanup 95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true)
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.775 2 INFO os_vif [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:5e:54,bridge_name='br-int',has_traffic_filtering=True,id=c1d0d784-50e4-485f-b045-97036b41371f,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d0d784-50')
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.776 2 INFO nova.virt.libvirt.driver [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Deleting instance files /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e_del
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.777 2 INFO nova.virt.libvirt.driver [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Deletion of /var/lib/nova/instances/d8d2f2de-031b-4ba3-8896-64dd2578899e_del complete
Oct 02 08:17:49 compute-0 systemd[1]: libpod-conmon-95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628.scope: Deactivated successfully.
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.828 2 INFO nova.compute.manager [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Took 0.38 seconds to destroy the instance on the hypervisor.
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.829 2 DEBUG oslo.service.loopingcall [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.830 2 DEBUG nova.compute.manager [-] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.830 2 DEBUG nova.network.neutron [-] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:17:49 compute-0 podman[218528]: 2025-10-02 08:17:49.847318919 +0000 UTC m=+0.049744468 container remove 95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.856 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[73d60256-1263-49cc-8edd-777a4dad03b9]: (4, ('Thu Oct  2 08:17:49 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f (95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628)\n95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628\nThu Oct  2 08:17:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f (95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628)\n95f051a19c0dd73cb90b3a1df0f2137cffa24fd1ce3ac816d65ec133c02bb628\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.859 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[75ba09a2-885e-41b3-b26a-42175487c27f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.860 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2dffba9-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:49 compute-0 kernel: tapd2dffba9-30: left promiscuous mode
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.872 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0aacc469-774c-41df-afc0-ac754b760247]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:49 compute-0 nova_compute[192567]: 2025-10-02 08:17:49.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.914 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[28abf627-7dd3-45d3-b744-360ba2808b21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.916 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3d8397-197e-4f01-83a1-1ecb1beb86e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.942 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a1029a-eded-4882-81ef-0a7603d06546]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 391707, 'reachable_time': 20548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218543, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:49 compute-0 systemd[1]: run-netns-ovnmeta\x2dd2dffba9\x2d387a\x2d40b6\x2dbcfb\x2d049fd17ed68f.mount: Deactivated successfully.
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.948 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:17:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:17:49.949 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[1c4ee49f-6d39-4421-bf08-16fbc304e67f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.201 2 DEBUG nova.compute.manager [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Received event network-vif-plugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.202 2 DEBUG oslo_concurrency.lockutils [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.203 2 DEBUG oslo_concurrency.lockutils [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.203 2 DEBUG oslo_concurrency.lockutils [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2d4e4e51-5053-4e7b-896e-526ac0cdc1a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.204 2 DEBUG nova.compute.manager [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] No waiting events found dispatching network-vif-plugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.204 2 WARNING nova.compute.manager [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Received unexpected event network-vif-plugged-0e5ee8e9-5d17-415d-b4c9-1ba25ca87dfb for instance with vm_state deleted and task_state None.
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.205 2 DEBUG nova.compute.manager [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Received event network-vif-unplugged-c1d0d784-50e4-485f-b045-97036b41371f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.205 2 DEBUG oslo_concurrency.lockutils [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "d8d2f2de-031b-4ba3-8896-64dd2578899e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.205 2 DEBUG oslo_concurrency.lockutils [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d8d2f2de-031b-4ba3-8896-64dd2578899e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.206 2 DEBUG oslo_concurrency.lockutils [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d8d2f2de-031b-4ba3-8896-64dd2578899e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.206 2 DEBUG nova.compute.manager [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] No waiting events found dispatching network-vif-unplugged-c1d0d784-50e4-485f-b045-97036b41371f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.207 2 DEBUG nova.compute.manager [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Received event network-vif-unplugged-c1d0d784-50e4-485f-b045-97036b41371f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.207 2 DEBUG nova.compute.manager [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Received event network-vif-plugged-c1d0d784-50e4-485f-b045-97036b41371f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.208 2 DEBUG oslo_concurrency.lockutils [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "d8d2f2de-031b-4ba3-8896-64dd2578899e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.208 2 DEBUG oslo_concurrency.lockutils [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d8d2f2de-031b-4ba3-8896-64dd2578899e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.208 2 DEBUG oslo_concurrency.lockutils [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d8d2f2de-031b-4ba3-8896-64dd2578899e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.209 2 DEBUG nova.compute.manager [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] No waiting events found dispatching network-vif-plugged-c1d0d784-50e4-485f-b045-97036b41371f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.209 2 WARNING nova.compute.manager [req-562d9e3e-c197-45e2-b5ac-52ce515c29fe req-8e214393-7055-462d-9e68-c18725431d81 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Received unexpected event network-vif-plugged-c1d0d784-50e4-485f-b045-97036b41371f for instance with vm_state active and task_state deleting.
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.382 2 DEBUG nova.network.neutron [-] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.431 2 INFO nova.compute.manager [-] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Took 0.60 seconds to deallocate network for instance.
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.465 2 DEBUG nova.compute.manager [req-1c7087ae-0992-4cd2-b85f-a72a852fe05b req-4b72003d-60f6-4d62-91a5-7e48b154e268 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Received event network-vif-deleted-c1d0d784-50e4-485f-b045-97036b41371f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.490 2 DEBUG oslo_concurrency.lockutils [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.491 2 DEBUG oslo_concurrency.lockutils [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.496 2 DEBUG oslo_concurrency.lockutils [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.517 2 INFO nova.scheduler.client.report [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Deleted allocations for instance d8d2f2de-031b-4ba3-8896-64dd2578899e
Oct 02 08:17:50 compute-0 nova_compute[192567]: 2025-10-02 08:17:50.619 2 DEBUG oslo_concurrency.lockutils [None req-a089b5d7-0ce9-4346-9bea-156f9679d557 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "d8d2f2de-031b-4ba3-8896-64dd2578899e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:17:51 compute-0 nova_compute[192567]: 2025-10-02 08:17:51.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:52 compute-0 podman[218545]: 2025-10-02 08:17:52.183958753 +0000 UTC m=+0.083457278 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent)
Oct 02 08:17:52 compute-0 podman[218547]: 2025-10-02 08:17:52.203201492 +0000 UTC m=+0.097061991 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3)
Oct 02 08:17:52 compute-0 podman[218546]: 2025-10-02 08:17:52.253128305 +0000 UTC m=+0.152730653 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2)
Oct 02 08:17:54 compute-0 podman[218605]: 2025-10-02 08:17:54.18573878 +0000 UTC m=+0.091988883 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:17:54 compute-0 nova_compute[192567]: 2025-10-02 08:17:54.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:56 compute-0 nova_compute[192567]: 2025-10-02 08:17:56.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:17:57 compute-0 podman[218627]: 2025-10-02 08:17:57.161738143 +0000 UTC m=+0.076936794 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:17:59 compute-0 podman[203011]: time="2025-10-02T08:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:17:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:17:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2995 "" "Go-http-client/1.1"
Oct 02 08:17:59 compute-0 nova_compute[192567]: 2025-10-02 08:17:59.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:01 compute-0 nova_compute[192567]: 2025-10-02 08:18:01.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:01 compute-0 openstack_network_exporter[205118]: ERROR   08:18:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:18:01 compute-0 openstack_network_exporter[205118]: ERROR   08:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:18:01 compute-0 openstack_network_exporter[205118]: ERROR   08:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:18:01 compute-0 openstack_network_exporter[205118]: ERROR   08:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:18:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:18:01 compute-0 openstack_network_exporter[205118]: ERROR   08:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:18:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:18:02 compute-0 nova_compute[192567]: 2025-10-02 08:18:02.335 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393067.3347263, 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:18:02 compute-0 nova_compute[192567]: 2025-10-02 08:18:02.336 2 INFO nova.compute.manager [-] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] VM Stopped (Lifecycle Event)
Oct 02 08:18:02 compute-0 nova_compute[192567]: 2025-10-02 08:18:02.380 2 DEBUG nova.compute.manager [None req-222e2d80-5fd0-4646-9776-8985e7bcb7d4 - - - - - -] [instance: 2d4e4e51-5053-4e7b-896e-526ac0cdc1a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:18:04 compute-0 nova_compute[192567]: 2025-10-02 08:18:04.735 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393069.734115, d8d2f2de-031b-4ba3-8896-64dd2578899e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:18:04 compute-0 nova_compute[192567]: 2025-10-02 08:18:04.736 2 INFO nova.compute.manager [-] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] VM Stopped (Lifecycle Event)
Oct 02 08:18:04 compute-0 nova_compute[192567]: 2025-10-02 08:18:04.766 2 DEBUG nova.compute.manager [None req-0e948e7a-21c7-4d44-9332-bbbe4da79112 - - - - - -] [instance: d8d2f2de-031b-4ba3-8896-64dd2578899e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:18:04 compute-0 nova_compute[192567]: 2025-10-02 08:18:04.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:06 compute-0 nova_compute[192567]: 2025-10-02 08:18:06.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:09 compute-0 nova_compute[192567]: 2025-10-02 08:18:09.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:11 compute-0 nova_compute[192567]: 2025-10-02 08:18:11.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:12 compute-0 podman[218651]: 2025-10-02 08:18:12.153435632 +0000 UTC m=+0.072784125 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Oct 02 08:18:14 compute-0 nova_compute[192567]: 2025-10-02 08:18:14.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:16 compute-0 nova_compute[192567]: 2025-10-02 08:18:16.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:19 compute-0 nova_compute[192567]: 2025-10-02 08:18:19.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:20 compute-0 ovn_controller[94821]: 2025-10-02T08:18:20Z|00094|memory_trim|INFO|Detected inactivity (last active 30020 ms ago): trimming memory
Oct 02 08:18:21 compute-0 nova_compute[192567]: 2025-10-02 08:18:21.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:23 compute-0 podman[218677]: 2025-10-02 08:18:23.212671101 +0000 UTC m=+0.088816185 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:18:23 compute-0 podman[218675]: 2025-10-02 08:18:23.217306705 +0000 UTC m=+0.115371830 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent)
Oct 02 08:18:23 compute-0 podman[218676]: 2025-10-02 08:18:23.280030316 +0000 UTC m=+0.165653945 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:18:24 compute-0 nova_compute[192567]: 2025-10-02 08:18:24.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:25 compute-0 podman[218738]: 2025-10-02 08:18:25.168477106 +0000 UTC m=+0.079887037 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.658 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.659 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.659 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.659 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.915 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.916 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5901MB free_disk=73.46571731567383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.916 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.917 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.997 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:18:25 compute-0 nova_compute[192567]: 2025-10-02 08:18:25.997 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:18:26 compute-0 nova_compute[192567]: 2025-10-02 08:18:26.016 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:18:26 compute-0 nova_compute[192567]: 2025-10-02 08:18:26.029 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:18:26 compute-0 nova_compute[192567]: 2025-10-02 08:18:26.053 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:18:26 compute-0 nova_compute[192567]: 2025-10-02 08:18:26.054 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:26 compute-0 nova_compute[192567]: 2025-10-02 08:18:26.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:28 compute-0 nova_compute[192567]: 2025-10-02 08:18:28.054 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:18:28 compute-0 nova_compute[192567]: 2025-10-02 08:18:28.055 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:18:28 compute-0 nova_compute[192567]: 2025-10-02 08:18:28.055 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:18:28 compute-0 nova_compute[192567]: 2025-10-02 08:18:28.075 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:18:28 compute-0 podman[218759]: 2025-10-02 08:18:28.181685678 +0000 UTC m=+0.085126049 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:18:28 compute-0 nova_compute[192567]: 2025-10-02 08:18:28.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:18:29 compute-0 nova_compute[192567]: 2025-10-02 08:18:29.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:18:29 compute-0 podman[203011]: time="2025-10-02T08:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:18:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:18:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2999 "" "Go-http-client/1.1"
Oct 02 08:18:29 compute-0 nova_compute[192567]: 2025-10-02 08:18:29.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:31 compute-0 nova_compute[192567]: 2025-10-02 08:18:31.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:31 compute-0 openstack_network_exporter[205118]: ERROR   08:18:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:18:31 compute-0 openstack_network_exporter[205118]: ERROR   08:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:18:31 compute-0 openstack_network_exporter[205118]: ERROR   08:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:18:31 compute-0 openstack_network_exporter[205118]: ERROR   08:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:18:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:18:31 compute-0 openstack_network_exporter[205118]: ERROR   08:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:18:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:18:32 compute-0 nova_compute[192567]: 2025-10-02 08:18:32.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:18:33 compute-0 nova_compute[192567]: 2025-10-02 08:18:33.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:18:33 compute-0 nova_compute[192567]: 2025-10-02 08:18:33.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:18:34 compute-0 nova_compute[192567]: 2025-10-02 08:18:34.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:35 compute-0 nova_compute[192567]: 2025-10-02 08:18:35.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:18:35 compute-0 nova_compute[192567]: 2025-10-02 08:18:35.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:18:35 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:35.981 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:18:35 compute-0 nova_compute[192567]: 2025-10-02 08:18:35.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:35 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:35.983 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:18:36 compute-0 nova_compute[192567]: 2025-10-02 08:18:36.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:39 compute-0 nova_compute[192567]: 2025-10-02 08:18:39.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:40 compute-0 nova_compute[192567]: 2025-10-02 08:18:40.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:18:41 compute-0 nova_compute[192567]: 2025-10-02 08:18:41.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:42 compute-0 nova_compute[192567]: 2025-10-02 08:18:42.666 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:42 compute-0 nova_compute[192567]: 2025-10-02 08:18:42.667 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:42 compute-0 nova_compute[192567]: 2025-10-02 08:18:42.693 2 DEBUG nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:18:42 compute-0 nova_compute[192567]: 2025-10-02 08:18:42.796 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:42 compute-0 nova_compute[192567]: 2025-10-02 08:18:42.797 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:42 compute-0 nova_compute[192567]: 2025-10-02 08:18:42.806 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:18:42 compute-0 nova_compute[192567]: 2025-10-02 08:18:42.807 2 INFO nova.compute.claims [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:18:42 compute-0 nova_compute[192567]: 2025-10-02 08:18:42.953 2 DEBUG nova.compute.provider_tree [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:18:42 compute-0 nova_compute[192567]: 2025-10-02 08:18:42.973 2 DEBUG nova.scheduler.client.report [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.002 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.003 2 DEBUG nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.096 2 DEBUG nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.097 2 DEBUG nova.network.neutron [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.120 2 INFO nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.141 2 DEBUG nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:18:43 compute-0 podman[218784]: 2025-10-02 08:18:43.190981675 +0000 UTC m=+0.101549500 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6)
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.230 2 DEBUG nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.233 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.233 2 INFO nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Creating image(s)
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.234 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "/var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.235 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "/var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.236 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "/var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.260 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.353 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.355 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.356 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.377 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.459 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.460 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.509 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.511 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.512 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.599 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.600 2 DEBUG nova.virt.disk.api [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Checking if we can resize image /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.601 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.695 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.699 2 DEBUG nova.virt.disk.api [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Cannot resize image /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.700 2 DEBUG nova.objects.instance [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lazy-loading 'migration_context' on Instance uuid 2992d4a5-e893-4c34-99fc-a7c5455d37f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.722 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.723 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Ensure instance console log exists: /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.724 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.725 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:43 compute-0 nova_compute[192567]: 2025-10-02 08:18:43.725 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:43 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:43.986 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:18:44 compute-0 nova_compute[192567]: 2025-10-02 08:18:44.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:44 compute-0 nova_compute[192567]: 2025-10-02 08:18:44.977 2 DEBUG nova.network.neutron [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Successfully created port: 90dae37e-3d1c-4776-99d8-6dc1921fb7ba _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:18:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:45.975 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:45.976 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:45.976 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:46 compute-0 nova_compute[192567]: 2025-10-02 08:18:46.120 2 DEBUG nova.network.neutron [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Successfully updated port: 90dae37e-3d1c-4776-99d8-6dc1921fb7ba _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:18:46 compute-0 nova_compute[192567]: 2025-10-02 08:18:46.141 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "refresh_cache-2992d4a5-e893-4c34-99fc-a7c5455d37f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:18:46 compute-0 nova_compute[192567]: 2025-10-02 08:18:46.141 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquired lock "refresh_cache-2992d4a5-e893-4c34-99fc-a7c5455d37f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:18:46 compute-0 nova_compute[192567]: 2025-10-02 08:18:46.142 2 DEBUG nova.network.neutron [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:18:46 compute-0 nova_compute[192567]: 2025-10-02 08:18:46.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:46 compute-0 nova_compute[192567]: 2025-10-02 08:18:46.254 2 DEBUG nova.compute.manager [req-b2f07f51-1981-4f89-b09e-3aa30f581147 req-305d4152-d438-4f68-b242-b5cd87c80f0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Received event network-changed-90dae37e-3d1c-4776-99d8-6dc1921fb7ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:18:46 compute-0 nova_compute[192567]: 2025-10-02 08:18:46.255 2 DEBUG nova.compute.manager [req-b2f07f51-1981-4f89-b09e-3aa30f581147 req-305d4152-d438-4f68-b242-b5cd87c80f0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Refreshing instance network info cache due to event network-changed-90dae37e-3d1c-4776-99d8-6dc1921fb7ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:18:46 compute-0 nova_compute[192567]: 2025-10-02 08:18:46.256 2 DEBUG oslo_concurrency.lockutils [req-b2f07f51-1981-4f89-b09e-3aa30f581147 req-305d4152-d438-4f68-b242-b5cd87c80f0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-2992d4a5-e893-4c34-99fc-a7c5455d37f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:18:46 compute-0 nova_compute[192567]: 2025-10-02 08:18:46.330 2 DEBUG nova.network.neutron [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.544 2 DEBUG nova.network.neutron [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Updating instance_info_cache with network_info: [{"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.570 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Releasing lock "refresh_cache-2992d4a5-e893-4c34-99fc-a7c5455d37f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.571 2 DEBUG nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Instance network_info: |[{"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.572 2 DEBUG oslo_concurrency.lockutils [req-b2f07f51-1981-4f89-b09e-3aa30f581147 req-305d4152-d438-4f68-b242-b5cd87c80f0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-2992d4a5-e893-4c34-99fc-a7c5455d37f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.572 2 DEBUG nova.network.neutron [req-b2f07f51-1981-4f89-b09e-3aa30f581147 req-305d4152-d438-4f68-b242-b5cd87c80f0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Refreshing network info cache for port 90dae37e-3d1c-4776-99d8-6dc1921fb7ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.578 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Start _get_guest_xml network_info=[{"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.585 2 WARNING nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.590 2 DEBUG nova.virt.libvirt.host [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.591 2 DEBUG nova.virt.libvirt.host [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.596 2 DEBUG nova.virt.libvirt.host [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.597 2 DEBUG nova.virt.libvirt.host [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.598 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.598 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.599 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.600 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.600 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.600 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.601 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.601 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.602 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.602 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.603 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.603 2 DEBUG nova.virt.hardware [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.609 2 DEBUG nova.virt.libvirt.vif [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:18:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-398970798',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-398970798',id=12,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ac58297e5b44744976c58f773f94090',ramdisk_id='',reservation_id='r-oyqi3b90',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073',owner_user_name='tempest-TestExecuteHostMaintenanceStr
ategy-1763362073-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:18:43Z,user_data=None,user_id='5455cae7258940a8926bef2dc2483570',uuid=2992d4a5-e893-4c34-99fc-a7c5455d37f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.610 2 DEBUG nova.network.os_vif_util [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converting VIF {"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.611 2 DEBUG nova.network.os_vif_util [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:cb:87,bridge_name='br-int',has_traffic_filtering=True,id=90dae37e-3d1c-4776-99d8-6dc1921fb7ba,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90dae37e-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.613 2 DEBUG nova.objects.instance [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2992d4a5-e893-4c34-99fc-a7c5455d37f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.631 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <uuid>2992d4a5-e893-4c34-99fc-a7c5455d37f6</uuid>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <name>instance-0000000c</name>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-398970798</nova:name>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:18:47</nova:creationTime>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:18:47 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:18:47 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:18:47 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:18:47 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:18:47 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:18:47 compute-0 nova_compute[192567]:         <nova:user uuid="5455cae7258940a8926bef2dc2483570">tempest-TestExecuteHostMaintenanceStrategy-1763362073-project-admin</nova:user>
Oct 02 08:18:47 compute-0 nova_compute[192567]:         <nova:project uuid="7ac58297e5b44744976c58f773f94090">tempest-TestExecuteHostMaintenanceStrategy-1763362073</nova:project>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:18:47 compute-0 nova_compute[192567]:         <nova:port uuid="90dae37e-3d1c-4776-99d8-6dc1921fb7ba">
Oct 02 08:18:47 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <system>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <entry name="serial">2992d4a5-e893-4c34-99fc-a7c5455d37f6</entry>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <entry name="uuid">2992d4a5-e893-4c34-99fc-a7c5455d37f6</entry>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     </system>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <os>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   </os>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <features>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   </features>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk.config"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:ce:cb:87"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <target dev="tap90dae37e-3d"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/console.log" append="off"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <video>
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     </video>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:18:47 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:18:47 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:18:47 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:18:47 compute-0 nova_compute[192567]: </domain>
Oct 02 08:18:47 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.632 2 DEBUG nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Preparing to wait for external event network-vif-plugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.632 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.633 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.633 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.634 2 DEBUG nova.virt.libvirt.vif [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:18:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-398970798',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-398970798',id=12,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ac58297e5b44744976c58f773f94090',ramdisk_id='',reservation_id='r-oyqi3b90',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073',owner_user_name='tempest-TestExecuteHostMain
tenanceStrategy-1763362073-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:18:43Z,user_data=None,user_id='5455cae7258940a8926bef2dc2483570',uuid=2992d4a5-e893-4c34-99fc-a7c5455d37f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.635 2 DEBUG nova.network.os_vif_util [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converting VIF {"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.636 2 DEBUG nova.network.os_vif_util [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ce:cb:87,bridge_name='br-int',has_traffic_filtering=True,id=90dae37e-3d1c-4776-99d8-6dc1921fb7ba,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90dae37e-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.636 2 DEBUG os_vif [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:cb:87,bridge_name='br-int',has_traffic_filtering=True,id=90dae37e-3d1c-4776-99d8-6dc1921fb7ba,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90dae37e-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.637 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.638 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.643 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90dae37e-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.643 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap90dae37e-3d, col_values=(('external_ids', {'iface-id': '90dae37e-3d1c-4776-99d8-6dc1921fb7ba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ce:cb:87', 'vm-uuid': '2992d4a5-e893-4c34-99fc-a7c5455d37f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:47 compute-0 NetworkManager[51654]: <info>  [1759393127.6479] manager: (tap90dae37e-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.657 2 INFO os_vif [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ce:cb:87,bridge_name='br-int',has_traffic_filtering=True,id=90dae37e-3d1c-4776-99d8-6dc1921fb7ba,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90dae37e-3d')
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.725 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.726 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.726 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] No VIF found with MAC fa:16:3e:ce:cb:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:18:47 compute-0 nova_compute[192567]: 2025-10-02 08:18:47.727 2 INFO nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Using config drive
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.121 2 INFO nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Creating config drive at /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk.config
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.129 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp423qzsgx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.277 2 DEBUG oslo_concurrency.processutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp423qzsgx" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:18:48 compute-0 kernel: tap90dae37e-3d: entered promiscuous mode
Oct 02 08:18:48 compute-0 NetworkManager[51654]: <info>  [1759393128.3669] manager: (tap90dae37e-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Oct 02 08:18:48 compute-0 ovn_controller[94821]: 2025-10-02T08:18:48Z|00095|binding|INFO|Claiming lport 90dae37e-3d1c-4776-99d8-6dc1921fb7ba for this chassis.
Oct 02 08:18:48 compute-0 ovn_controller[94821]: 2025-10-02T08:18:48Z|00096|binding|INFO|90dae37e-3d1c-4776-99d8-6dc1921fb7ba: Claiming fa:16:3e:ce:cb:87 10.100.0.8
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.381 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:cb:87 10.100.0.8'], port_security=['fa:16:3e:ce:cb:87 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '2992d4a5-e893-4c34-99fc-a7c5455d37f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ac58297e5b44744976c58f773f94090', 'neutron:revision_number': '2', 'neutron:security_group_ids': '92c02662-21d7-4fe7-9c02-e6a0bb798f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=176593bb-df9e-44fd-86b3-56aea7ef157a, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=90dae37e-3d1c-4776-99d8-6dc1921fb7ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.383 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 90dae37e-3d1c-4776-99d8-6dc1921fb7ba in datapath d2dffba9-387a-40b6-bcfb-049fd17ed68f bound to our chassis
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.385 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2dffba9-387a-40b6-bcfb-049fd17ed68f
Oct 02 08:18:48 compute-0 ovn_controller[94821]: 2025-10-02T08:18:48Z|00097|binding|INFO|Setting lport 90dae37e-3d1c-4776-99d8-6dc1921fb7ba ovn-installed in OVS
Oct 02 08:18:48 compute-0 ovn_controller[94821]: 2025-10-02T08:18:48Z|00098|binding|INFO|Setting lport 90dae37e-3d1c-4776-99d8-6dc1921fb7ba up in Southbound
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.406 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[29bc5aa6-4171-4652-8aab-168a1a415313]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.407 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd2dffba9-31 in ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.411 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd2dffba9-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.412 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[cd7567f7-552a-4b62-a202-7d446620e3bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.413 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b89f8e26-b243-488a-9102-b04e86034e73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 systemd-udevd[218840]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:18:48 compute-0 systemd-machined[152597]: New machine qemu-9-instance-0000000c.
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.436 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce8acc6-c12b-40ca-807c-7a21855846ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000c.
Oct 02 08:18:48 compute-0 NetworkManager[51654]: <info>  [1759393128.4608] device (tap90dae37e-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:18:48 compute-0 NetworkManager[51654]: <info>  [1759393128.4632] device (tap90dae37e-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.475 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[fb30fbce-d134-4414-b409-8b821a29245e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.531 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[b605c846-609b-4d4f-94e1-a570b2ba7dae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 NetworkManager[51654]: <info>  [1759393128.5424] manager: (tapd2dffba9-30): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.543 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[2275b941-db83-45c5-a636-66e997bc0a15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 systemd-udevd[218843]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.598 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[42ec98b8-243b-4a26-87bc-e7e9083c1626]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.602 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[109e9667-033d-49e0-9deb-14e80bea116c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 NetworkManager[51654]: <info>  [1759393128.6520] device (tapd2dffba9-30): carrier: link connected
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.667 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[128bb69e-2bec-4027-afd6-7c073858215c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.690 2 DEBUG nova.compute.manager [req-94194aed-ee0f-4460-b1e2-27309b08f5ef req-bcdbfddf-826b-4ac6-b3ff-dfa5d1743a7d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Received event network-vif-plugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.691 2 DEBUG oslo_concurrency.lockutils [req-94194aed-ee0f-4460-b1e2-27309b08f5ef req-bcdbfddf-826b-4ac6-b3ff-dfa5d1743a7d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.691 2 DEBUG oslo_concurrency.lockutils [req-94194aed-ee0f-4460-b1e2-27309b08f5ef req-bcdbfddf-826b-4ac6-b3ff-dfa5d1743a7d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.691 2 DEBUG oslo_concurrency.lockutils [req-94194aed-ee0f-4460-b1e2-27309b08f5ef req-bcdbfddf-826b-4ac6-b3ff-dfa5d1743a7d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.692 2 DEBUG nova.compute.manager [req-94194aed-ee0f-4460-b1e2-27309b08f5ef req-bcdbfddf-826b-4ac6-b3ff-dfa5d1743a7d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Processing event network-vif-plugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.694 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[20a3f857-b310-4768-a70d-5a7c3ee9fcc5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2dffba9-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:a1:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401624, 'reachable_time': 37985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218871, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.723 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3e2ef4-642c-4ca7-a74b-f8b3b725ed56]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe06:a1f9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401624, 'tstamp': 401624}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218872, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.756 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[baf570c0-3584-4a81-94fc-c76bca46d032]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2dffba9-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:a1:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401624, 'reachable_time': 37985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218873, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.806 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4d254a2d-ac7a-4962-bb14-46e55952d3df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.909 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[3d9d8d32-783a-41cc-918a-8d586846d21b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.910 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2dffba9-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.911 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.912 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2dffba9-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:48 compute-0 kernel: tapd2dffba9-30: entered promiscuous mode
Oct 02 08:18:48 compute-0 NetworkManager[51654]: <info>  [1759393128.9177] manager: (tapd2dffba9-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.920 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2dffba9-30, col_values=(('external_ids', {'iface-id': 'e3ee8aeb-cc58-469b-9f75-ef53474d1d07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:48 compute-0 ovn_controller[94821]: 2025-10-02T08:18:48Z|00099|binding|INFO|Releasing lport e3ee8aeb-cc58-469b-9f75-ef53474d1d07 from this chassis (sb_readonly=0)
Oct 02 08:18:48 compute-0 nova_compute[192567]: 2025-10-02 08:18:48.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.950 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d2dffba9-387a-40b6-bcfb-049fd17ed68f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d2dffba9-387a-40b6-bcfb-049fd17ed68f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.951 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[855576ee-ec0b-45dc-8439-2754a2720ca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.953 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-d2dffba9-387a-40b6-bcfb-049fd17ed68f
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/d2dffba9-387a-40b6-bcfb-049fd17ed68f.pid.haproxy
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID d2dffba9-387a-40b6-bcfb-049fd17ed68f
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:18:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:18:48.954 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'env', 'PROCESS_TAG=haproxy-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d2dffba9-387a-40b6-bcfb-049fd17ed68f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.131 2 DEBUG nova.network.neutron [req-b2f07f51-1981-4f89-b09e-3aa30f581147 req-305d4152-d438-4f68-b242-b5cd87c80f0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Updated VIF entry in instance network info cache for port 90dae37e-3d1c-4776-99d8-6dc1921fb7ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.133 2 DEBUG nova.network.neutron [req-b2f07f51-1981-4f89-b09e-3aa30f581147 req-305d4152-d438-4f68-b242-b5cd87c80f0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Updating instance_info_cache with network_info: [{"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.162 2 DEBUG oslo_concurrency.lockutils [req-b2f07f51-1981-4f89-b09e-3aa30f581147 req-305d4152-d438-4f68-b242-b5cd87c80f0c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-2992d4a5-e893-4c34-99fc-a7c5455d37f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:18:49 compute-0 podman[218912]: 2025-10-02 08:18:49.419502857 +0000 UTC m=+0.072774365 container create 6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 02 08:18:49 compute-0 podman[218912]: 2025-10-02 08:18:49.377904942 +0000 UTC m=+0.031176500 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:18:49 compute-0 systemd[1]: Started libpod-conmon-6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52.scope.
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.488 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393129.4880369, 2992d4a5-e893-4c34-99fc-a7c5455d37f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.489 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] VM Started (Lifecycle Event)
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.492 2 DEBUG nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.495 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.499 2 INFO nova.virt.libvirt.driver [-] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Instance spawned successfully.
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.499 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:18:49 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.517 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:18:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85176e27ed0370be109e2f38d9165906ba2da6858a117c79238bb0558d37261d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.538 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:18:49 compute-0 podman[218912]: 2025-10-02 08:18:49.538963943 +0000 UTC m=+0.192235451 container init 6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.546 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.546 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.547 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.547 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.548 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.548 2 DEBUG nova.virt.libvirt.driver [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:18:49 compute-0 podman[218912]: 2025-10-02 08:18:49.549229002 +0000 UTC m=+0.202500490 container start 6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 02 08:18:49 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218927]: [NOTICE]   (218932) : New worker (218934) forked
Oct 02 08:18:49 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218927]: [NOTICE]   (218932) : Loading success.
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.587 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.587 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393129.4894361, 2992d4a5-e893-4c34-99fc-a7c5455d37f6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.588 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] VM Paused (Lifecycle Event)
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.623 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.629 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393129.495349, 2992d4a5-e893-4c34-99fc-a7c5455d37f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.629 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] VM Resumed (Lifecycle Event)
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.633 2 INFO nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Took 6.40 seconds to spawn the instance on the hypervisor.
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.634 2 DEBUG nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.666 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.670 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.697 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.707 2 INFO nova.compute.manager [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Took 6.94 seconds to build instance.
Oct 02 08:18:49 compute-0 nova_compute[192567]: 2025-10-02 08:18:49.724 2 DEBUG oslo_concurrency.lockutils [None req-0fac9127-2feb-474a-9b17-fe44e2ca7ca8 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:50 compute-0 nova_compute[192567]: 2025-10-02 08:18:50.800 2 DEBUG nova.compute.manager [req-c78756cf-b6a2-48aa-bf0f-b4332ebf3b1b req-aa8cbde3-988b-4f8d-a1e3-221f61c4f919 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Received event network-vif-plugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:18:50 compute-0 nova_compute[192567]: 2025-10-02 08:18:50.802 2 DEBUG oslo_concurrency.lockutils [req-c78756cf-b6a2-48aa-bf0f-b4332ebf3b1b req-aa8cbde3-988b-4f8d-a1e3-221f61c4f919 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:18:50 compute-0 nova_compute[192567]: 2025-10-02 08:18:50.802 2 DEBUG oslo_concurrency.lockutils [req-c78756cf-b6a2-48aa-bf0f-b4332ebf3b1b req-aa8cbde3-988b-4f8d-a1e3-221f61c4f919 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:18:50 compute-0 nova_compute[192567]: 2025-10-02 08:18:50.803 2 DEBUG oslo_concurrency.lockutils [req-c78756cf-b6a2-48aa-bf0f-b4332ebf3b1b req-aa8cbde3-988b-4f8d-a1e3-221f61c4f919 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:18:50 compute-0 nova_compute[192567]: 2025-10-02 08:18:50.803 2 DEBUG nova.compute.manager [req-c78756cf-b6a2-48aa-bf0f-b4332ebf3b1b req-aa8cbde3-988b-4f8d-a1e3-221f61c4f919 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] No waiting events found dispatching network-vif-plugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:18:50 compute-0 nova_compute[192567]: 2025-10-02 08:18:50.804 2 WARNING nova.compute.manager [req-c78756cf-b6a2-48aa-bf0f-b4332ebf3b1b req-aa8cbde3-988b-4f8d-a1e3-221f61c4f919 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Received unexpected event network-vif-plugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba for instance with vm_state active and task_state None.
Oct 02 08:18:51 compute-0 nova_compute[192567]: 2025-10-02 08:18:51.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:52 compute-0 nova_compute[192567]: 2025-10-02 08:18:52.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:54 compute-0 podman[218943]: 2025-10-02 08:18:54.18714074 +0000 UTC m=+0.087739260 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 02 08:18:54 compute-0 podman[218945]: 2025-10-02 08:18:54.221813039 +0000 UTC m=+0.115468713 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:18:54 compute-0 podman[218944]: 2025-10-02 08:18:54.259583594 +0000 UTC m=+0.158110870 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 02 08:18:56 compute-0 podman[219004]: 2025-10-02 08:18:56.170592096 +0000 UTC m=+0.085795799 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 02 08:18:56 compute-0 nova_compute[192567]: 2025-10-02 08:18:56.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:57 compute-0 nova_compute[192567]: 2025-10-02 08:18:57.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:18:59 compute-0 podman[219026]: 2025-10-02 08:18:59.181341972 +0000 UTC m=+0.082850838 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:18:59 compute-0 podman[203011]: time="2025-10-02T08:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:18:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:18:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3470 "" "Go-http-client/1.1"
Oct 02 08:19:00 compute-0 ovn_controller[94821]: 2025-10-02T08:19:00Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ce:cb:87 10.100.0.8
Oct 02 08:19:00 compute-0 ovn_controller[94821]: 2025-10-02T08:19:00Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ce:cb:87 10.100.0.8
Oct 02 08:19:01 compute-0 nova_compute[192567]: 2025-10-02 08:19:01.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:01 compute-0 openstack_network_exporter[205118]: ERROR   08:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:19:01 compute-0 openstack_network_exporter[205118]: ERROR   08:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:19:01 compute-0 openstack_network_exporter[205118]: ERROR   08:19:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:19:01 compute-0 openstack_network_exporter[205118]: ERROR   08:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:19:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:19:01 compute-0 openstack_network_exporter[205118]: ERROR   08:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:19:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:19:02 compute-0 nova_compute[192567]: 2025-10-02 08:19:02.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:03 compute-0 unix_chkpwd[219063]: password check failed for user (root)
Oct 02 08:19:04 compute-0 sshd-session[219061]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 02 08:19:06 compute-0 nova_compute[192567]: 2025-10-02 08:19:06.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:06 compute-0 sshd-session[219061]: Failed password for root from 193.46.255.159 port 40252 ssh2
Oct 02 08:19:07 compute-0 nova_compute[192567]: 2025-10-02 08:19:07.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:08 compute-0 unix_chkpwd[219064]: password check failed for user (root)
Oct 02 08:19:10 compute-0 sshd-session[219061]: Failed password for root from 193.46.255.159 port 40252 ssh2
Oct 02 08:19:11 compute-0 nova_compute[192567]: 2025-10-02 08:19:11.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:12 compute-0 unix_chkpwd[219065]: password check failed for user (root)
Oct 02 08:19:12 compute-0 nova_compute[192567]: 2025-10-02 08:19:12.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:14 compute-0 podman[219066]: 2025-10-02 08:19:14.198571445 +0000 UTC m=+0.098487085 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git)
Oct 02 08:19:14 compute-0 sshd-session[219061]: Failed password for root from 193.46.255.159 port 40252 ssh2
Oct 02 08:19:14 compute-0 sshd-session[219061]: Received disconnect from 193.46.255.159 port 40252:11:  [preauth]
Oct 02 08:19:14 compute-0 sshd-session[219061]: Disconnected from authenticating user root 193.46.255.159 port 40252 [preauth]
Oct 02 08:19:14 compute-0 sshd-session[219061]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 02 08:19:15 compute-0 unix_chkpwd[219089]: password check failed for user (root)
Oct 02 08:19:15 compute-0 sshd-session[219087]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 02 08:19:16 compute-0 nova_compute[192567]: 2025-10-02 08:19:16.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:17 compute-0 nova_compute[192567]: 2025-10-02 08:19:17.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:17 compute-0 sshd-session[219087]: Failed password for root from 193.46.255.159 port 33362 ssh2
Oct 02 08:19:18 compute-0 ovn_controller[94821]: 2025-10-02T08:19:18Z|00100|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Oct 02 08:19:19 compute-0 unix_chkpwd[219091]: password check failed for user (root)
Oct 02 08:19:21 compute-0 nova_compute[192567]: 2025-10-02 08:19:21.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:21 compute-0 sshd-session[219087]: Failed password for root from 193.46.255.159 port 33362 ssh2
Oct 02 08:19:22 compute-0 nova_compute[192567]: 2025-10-02 08:19:22.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:23 compute-0 unix_chkpwd[219092]: password check failed for user (root)
Oct 02 08:19:25 compute-0 podman[219095]: 2025-10-02 08:19:25.194293357 +0000 UTC m=+0.094429648 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:19:25 compute-0 podman[219093]: 2025-10-02 08:19:25.213449443 +0000 UTC m=+0.119896931 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 02 08:19:25 compute-0 podman[219094]: 2025-10-02 08:19:25.225633712 +0000 UTC m=+0.132694119 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:19:25 compute-0 sshd-session[219087]: Failed password for root from 193.46.255.159 port 33362 ssh2
Oct 02 08:19:26 compute-0 nova_compute[192567]: 2025-10-02 08:19:26.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:27 compute-0 podman[219157]: 2025-10-02 08:19:27.177857077 +0000 UTC m=+0.096468463 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 02 08:19:27 compute-0 nova_compute[192567]: 2025-10-02 08:19:27.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:19:27 compute-0 nova_compute[192567]: 2025-10-02 08:19:27.659 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:19:27 compute-0 nova_compute[192567]: 2025-10-02 08:19:27.660 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:19:27 compute-0 nova_compute[192567]: 2025-10-02 08:19:27.660 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:19:27 compute-0 nova_compute[192567]: 2025-10-02 08:19:27.660 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:19:27 compute-0 nova_compute[192567]: 2025-10-02 08:19:27.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:27 compute-0 nova_compute[192567]: 2025-10-02 08:19:27.761 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:19:27 compute-0 nova_compute[192567]: 2025-10-02 08:19:27.850 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:19:27 compute-0 nova_compute[192567]: 2025-10-02 08:19:27.851 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:19:27 compute-0 sshd-session[219087]: Received disconnect from 193.46.255.159 port 33362:11:  [preauth]
Oct 02 08:19:27 compute-0 sshd-session[219087]: Disconnected from authenticating user root 193.46.255.159 port 33362 [preauth]
Oct 02 08:19:27 compute-0 sshd-session[219087]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 02 08:19:27 compute-0 nova_compute[192567]: 2025-10-02 08:19:27.945 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.197 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.200 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5703MB free_disk=73.43684768676758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.201 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.202 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.293 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 2992d4a5-e893-4c34-99fc-a7c5455d37f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.293 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.294 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.354 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.375 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.405 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:19:28 compute-0 nova_compute[192567]: 2025-10-02 08:19:28.405 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:19:28 compute-0 unix_chkpwd[219186]: password check failed for user (root)
Oct 02 08:19:28 compute-0 sshd-session[219184]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 02 08:19:29 compute-0 podman[203011]: time="2025-10-02T08:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:19:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:19:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3465 "" "Go-http-client/1.1"
Oct 02 08:19:30 compute-0 podman[219187]: 2025-10-02 08:19:30.155973258 +0000 UTC m=+0.063433245 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:19:30 compute-0 nova_compute[192567]: 2025-10-02 08:19:30.406 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:19:30 compute-0 nova_compute[192567]: 2025-10-02 08:19:30.406 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:19:30 compute-0 nova_compute[192567]: 2025-10-02 08:19:30.407 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:19:30 compute-0 sshd-session[219184]: Failed password for root from 193.46.255.159 port 28418 ssh2
Oct 02 08:19:30 compute-0 nova_compute[192567]: 2025-10-02 08:19:30.873 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-2992d4a5-e893-4c34-99fc-a7c5455d37f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:19:30 compute-0 nova_compute[192567]: 2025-10-02 08:19:30.873 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-2992d4a5-e893-4c34-99fc-a7c5455d37f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:19:30 compute-0 nova_compute[192567]: 2025-10-02 08:19:30.873 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:19:30 compute-0 nova_compute[192567]: 2025-10-02 08:19:30.874 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2992d4a5-e893-4c34-99fc-a7c5455d37f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:19:30 compute-0 unix_chkpwd[219211]: password check failed for user (root)
Oct 02 08:19:31 compute-0 nova_compute[192567]: 2025-10-02 08:19:31.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:31 compute-0 openstack_network_exporter[205118]: ERROR   08:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:19:31 compute-0 openstack_network_exporter[205118]: ERROR   08:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:19:31 compute-0 openstack_network_exporter[205118]: ERROR   08:19:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:19:31 compute-0 openstack_network_exporter[205118]: ERROR   08:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:19:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:19:31 compute-0 openstack_network_exporter[205118]: ERROR   08:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:19:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:19:32 compute-0 nova_compute[192567]: 2025-10-02 08:19:32.137 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Updating instance_info_cache with network_info: [{"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:19:32 compute-0 nova_compute[192567]: 2025-10-02 08:19:32.154 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-2992d4a5-e893-4c34-99fc-a7c5455d37f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:19:32 compute-0 nova_compute[192567]: 2025-10-02 08:19:32.155 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:19:32 compute-0 nova_compute[192567]: 2025-10-02 08:19:32.155 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:19:32 compute-0 nova_compute[192567]: 2025-10-02 08:19:32.156 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:19:32 compute-0 nova_compute[192567]: 2025-10-02 08:19:32.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:33 compute-0 sshd-session[219184]: Failed password for root from 193.46.255.159 port 28418 ssh2
Oct 02 08:19:33 compute-0 nova_compute[192567]: 2025-10-02 08:19:33.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:19:33 compute-0 nova_compute[192567]: 2025-10-02 08:19:33.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:19:34 compute-0 nova_compute[192567]: 2025-10-02 08:19:34.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:19:35 compute-0 unix_chkpwd[219212]: password check failed for user (root)
Oct 02 08:19:36 compute-0 nova_compute[192567]: 2025-10-02 08:19:36.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:36 compute-0 sshd-session[219184]: Failed password for root from 193.46.255.159 port 28418 ssh2
Oct 02 08:19:37 compute-0 sshd-session[219184]: Received disconnect from 193.46.255.159 port 28418:11:  [preauth]
Oct 02 08:19:37 compute-0 sshd-session[219184]: Disconnected from authenticating user root 193.46.255.159 port 28418 [preauth]
Oct 02 08:19:37 compute-0 sshd-session[219184]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Oct 02 08:19:37 compute-0 nova_compute[192567]: 2025-10-02 08:19:37.301 2 DEBUG nova.virt.libvirt.driver [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Creating tmpfile /var/lib/nova/instances/tmpp_r2f99o to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:19:37 compute-0 nova_compute[192567]: 2025-10-02 08:19:37.302 2 DEBUG nova.compute.manager [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp_r2f99o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:19:37 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 08:19:37 compute-0 nova_compute[192567]: 2025-10-02 08:19:37.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:19:37 compute-0 nova_compute[192567]: 2025-10-02 08:19:37.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:19:37 compute-0 nova_compute[192567]: 2025-10-02 08:19:37.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:38 compute-0 nova_compute[192567]: 2025-10-02 08:19:38.431 2 DEBUG nova.compute.manager [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp_r2f99o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cb60bdfa-c18e-443a-89d3-6173d3d01122',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:19:38 compute-0 nova_compute[192567]: 2025-10-02 08:19:38.478 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-cb60bdfa-c18e-443a-89d3-6173d3d01122" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:19:38 compute-0 nova_compute[192567]: 2025-10-02 08:19:38.478 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-cb60bdfa-c18e-443a-89d3-6173d3d01122" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:19:38 compute-0 nova_compute[192567]: 2025-10-02 08:19:38.479 2 DEBUG nova.network.neutron [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:19:38 compute-0 nova_compute[192567]: 2025-10-02 08:19:38.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:19:39 compute-0 nova_compute[192567]: 2025-10-02 08:19:39.905 2 DEBUG nova.network.neutron [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Updating instance_info_cache with network_info: [{"id": "9f93c26c-cdde-4b08-b999-093d767b4724", "address": "fa:16:3e:90:7f:b3", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f93c26c-cd", "ovs_interfaceid": "9f93c26c-cdde-4b08-b999-093d767b4724", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:19:39 compute-0 nova_compute[192567]: 2025-10-02 08:19:39.925 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-cb60bdfa-c18e-443a-89d3-6173d3d01122" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:19:39 compute-0 nova_compute[192567]: 2025-10-02 08:19:39.928 2 DEBUG nova.virt.libvirt.driver [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp_r2f99o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cb60bdfa-c18e-443a-89d3-6173d3d01122',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:19:39 compute-0 nova_compute[192567]: 2025-10-02 08:19:39.929 2 DEBUG nova.virt.libvirt.driver [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Creating instance directory: /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:19:39 compute-0 nova_compute[192567]: 2025-10-02 08:19:39.929 2 DEBUG nova.virt.libvirt.driver [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Creating disk.info with the contents: {'/var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk': 'qcow2', '/var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:19:39 compute-0 nova_compute[192567]: 2025-10-02 08:19:39.930 2 DEBUG nova.virt.libvirt.driver [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:19:39 compute-0 nova_compute[192567]: 2025-10-02 08:19:39.931 2 DEBUG nova.objects.instance [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid cb60bdfa-c18e-443a-89d3-6173d3d01122 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:19:39 compute-0 nova_compute[192567]: 2025-10-02 08:19:39.970 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.066 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.068 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.069 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.094 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.180 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.182 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.215 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.217 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.218 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.281 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.283 2 DEBUG nova.virt.disk.api [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.283 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.335 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.337 2 DEBUG nova.virt.disk.api [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.338 2 DEBUG nova.objects.instance [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid cb60bdfa-c18e-443a-89d3-6173d3d01122 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.360 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.386 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk.config 485376" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.388 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk.config to /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.388 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk.config /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.966 2 DEBUG oslo_concurrency.processutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122/disk.config /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122" returned: 0 in 0.578s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.967 2 DEBUG nova.virt.libvirt.driver [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.969 2 DEBUG nova.virt.libvirt.vif [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:18:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1519695324',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1519695324',id=11,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:18:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7ac58297e5b44744976c58f773f94090',ramdisk_id='',reservation_id='r-i59iux5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:18:35Z,user_data=None,user_id='5455cae7258940a8926bef2dc2483570',uuid=cb60bdfa-c18e-443a-89d3-6173d3d01122,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f93c26c-cdde-4b08-b999-093d767b4724", "address": "fa:16:3e:90:7f:b3", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9f93c26c-cd", "ovs_interfaceid": "9f93c26c-cdde-4b08-b999-093d767b4724", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.969 2 DEBUG nova.network.os_vif_util [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "9f93c26c-cdde-4b08-b999-093d767b4724", "address": "fa:16:3e:90:7f:b3", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9f93c26c-cd", "ovs_interfaceid": "9f93c26c-cdde-4b08-b999-093d767b4724", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.970 2 DEBUG nova.network.os_vif_util [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:90:7f:b3,bridge_name='br-int',has_traffic_filtering=True,id=9f93c26c-cdde-4b08-b999-093d767b4724,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f93c26c-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.970 2 DEBUG os_vif [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:7f:b3,bridge_name='br-int',has_traffic_filtering=True,id=9f93c26c-cdde-4b08-b999-093d767b4724,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f93c26c-cd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.972 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.973 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.976 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f93c26c-cd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.976 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9f93c26c-cd, col_values=(('external_ids', {'iface-id': '9f93c26c-cdde-4b08-b999-093d767b4724', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:90:7f:b3', 'vm-uuid': 'cb60bdfa-c18e-443a-89d3-6173d3d01122'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:40 compute-0 NetworkManager[51654]: <info>  [1759393180.9798] manager: (tap9f93c26c-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.987 2 INFO os_vif [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:90:7f:b3,bridge_name='br-int',has_traffic_filtering=True,id=9f93c26c-cdde-4b08-b999-093d767b4724,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f93c26c-cd')
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.988 2 DEBUG nova.virt.libvirt.driver [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:19:40 compute-0 nova_compute[192567]: 2025-10-02 08:19:40.988 2 DEBUG nova.compute.manager [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp_r2f99o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cb60bdfa-c18e-443a-89d3-6173d3d01122',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:19:41 compute-0 nova_compute[192567]: 2025-10-02 08:19:41.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:42 compute-0 nova_compute[192567]: 2025-10-02 08:19:42.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:19:42 compute-0 nova_compute[192567]: 2025-10-02 08:19:42.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:42 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:42.990 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:19:42 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:42.992 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:19:44 compute-0 nova_compute[192567]: 2025-10-02 08:19:44.018 2 DEBUG nova.network.neutron [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Port 9f93c26c-cdde-4b08-b999-093d767b4724 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:19:44 compute-0 nova_compute[192567]: 2025-10-02 08:19:44.023 2 DEBUG nova.compute.manager [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpp_r2f99o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='cb60bdfa-c18e-443a-89d3-6173d3d01122',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:19:44 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 08:19:44 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 08:19:44 compute-0 podman[219253]: 2025-10-02 08:19:44.393725932 +0000 UTC m=+0.099224447 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Oct 02 08:19:44 compute-0 kernel: tap9f93c26c-cd: entered promiscuous mode
Oct 02 08:19:44 compute-0 NetworkManager[51654]: <info>  [1759393184.4523] manager: (tap9f93c26c-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Oct 02 08:19:44 compute-0 ovn_controller[94821]: 2025-10-02T08:19:44Z|00101|binding|INFO|Claiming lport 9f93c26c-cdde-4b08-b999-093d767b4724 for this additional chassis.
Oct 02 08:19:44 compute-0 ovn_controller[94821]: 2025-10-02T08:19:44Z|00102|binding|INFO|9f93c26c-cdde-4b08-b999-093d767b4724: Claiming fa:16:3e:90:7f:b3 10.100.0.4
Oct 02 08:19:44 compute-0 nova_compute[192567]: 2025-10-02 08:19:44.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:44 compute-0 ovn_controller[94821]: 2025-10-02T08:19:44Z|00103|binding|INFO|Setting lport 9f93c26c-cdde-4b08-b999-093d767b4724 ovn-installed in OVS
Oct 02 08:19:44 compute-0 nova_compute[192567]: 2025-10-02 08:19:44.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:44 compute-0 nova_compute[192567]: 2025-10-02 08:19:44.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:44 compute-0 systemd-udevd[219288]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:19:44 compute-0 systemd-machined[152597]: New machine qemu-10-instance-0000000b.
Oct 02 08:19:44 compute-0 NetworkManager[51654]: <info>  [1759393184.5350] device (tap9f93c26c-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:19:44 compute-0 NetworkManager[51654]: <info>  [1759393184.5383] device (tap9f93c26c-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:19:44 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000b.
Oct 02 08:19:45 compute-0 nova_compute[192567]: 2025-10-02 08:19:45.776 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393185.7750435, cb60bdfa-c18e-443a-89d3-6173d3d01122 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:19:45 compute-0 nova_compute[192567]: 2025-10-02 08:19:45.778 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] VM Started (Lifecycle Event)
Oct 02 08:19:45 compute-0 nova_compute[192567]: 2025-10-02 08:19:45.811 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:19:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:45.977 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:19:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:45.977 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:19:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:45.978 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:19:45 compute-0 nova_compute[192567]: 2025-10-02 08:19:45.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:46 compute-0 nova_compute[192567]: 2025-10-02 08:19:46.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:46 compute-0 nova_compute[192567]: 2025-10-02 08:19:46.738 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393186.7380133, cb60bdfa-c18e-443a-89d3-6173d3d01122 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:19:46 compute-0 nova_compute[192567]: 2025-10-02 08:19:46.738 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] VM Resumed (Lifecycle Event)
Oct 02 08:19:46 compute-0 nova_compute[192567]: 2025-10-02 08:19:46.764 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:19:46 compute-0 nova_compute[192567]: 2025-10-02 08:19:46.768 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:19:46 compute-0 nova_compute[192567]: 2025-10-02 08:19:46.793 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:19:47 compute-0 ovn_controller[94821]: 2025-10-02T08:19:47Z|00104|binding|INFO|Claiming lport 9f93c26c-cdde-4b08-b999-093d767b4724 for this chassis.
Oct 02 08:19:47 compute-0 ovn_controller[94821]: 2025-10-02T08:19:47Z|00105|binding|INFO|9f93c26c-cdde-4b08-b999-093d767b4724: Claiming fa:16:3e:90:7f:b3 10.100.0.4
Oct 02 08:19:47 compute-0 ovn_controller[94821]: 2025-10-02T08:19:47Z|00106|binding|INFO|Setting lport 9f93c26c-cdde-4b08-b999-093d767b4724 up in Southbound
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.665 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:7f:b3 10.100.0.4'], port_security=['fa:16:3e:90:7f:b3 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'cb60bdfa-c18e-443a-89d3-6173d3d01122', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ac58297e5b44744976c58f773f94090', 'neutron:revision_number': '11', 'neutron:security_group_ids': '92c02662-21d7-4fe7-9c02-e6a0bb798f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=176593bb-df9e-44fd-86b3-56aea7ef157a, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=9f93c26c-cdde-4b08-b999-093d767b4724) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.668 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 9f93c26c-cdde-4b08-b999-093d767b4724 in datapath d2dffba9-387a-40b6-bcfb-049fd17ed68f bound to our chassis
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.671 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2dffba9-387a-40b6-bcfb-049fd17ed68f
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.695 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[31deb665-bf80-49e1-8cab-1fe84b94412f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.745 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[65b12d1d-a21f-40c0-b134-c116085a1f0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.749 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[50a91aba-9194-4f5a-b8e3-939d1b7c202b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.793 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[3bbff1af-a1e4-492d-89c6-474af4ba317e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:47 compute-0 nova_compute[192567]: 2025-10-02 08:19:47.806 2 INFO nova.compute.manager [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Post operation of migration started
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.817 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[6775b4eb-66e6-44ec-b85a-7fd64db1fd78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2dffba9-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:a1:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1126, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1126, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401624, 'reachable_time': 37985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219330, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.840 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[57d403f7-1794-492b-ab48-a6fee45b00f4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd2dffba9-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401643, 'tstamp': 401643}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219331, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd2dffba9-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401648, 'tstamp': 401648}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219331, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.842 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2dffba9-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:47 compute-0 nova_compute[192567]: 2025-10-02 08:19:47.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.848 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2dffba9-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.849 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.850 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2dffba9-30, col_values=(('external_ids', {'iface-id': 'e3ee8aeb-cc58-469b-9f75-ef53474d1d07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:47.851 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:19:48 compute-0 nova_compute[192567]: 2025-10-02 08:19:48.224 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-cb60bdfa-c18e-443a-89d3-6173d3d01122" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:19:48 compute-0 nova_compute[192567]: 2025-10-02 08:19:48.225 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-cb60bdfa-c18e-443a-89d3-6173d3d01122" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:19:48 compute-0 nova_compute[192567]: 2025-10-02 08:19:48.225 2 DEBUG nova.network.neutron [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:19:50 compute-0 nova_compute[192567]: 2025-10-02 08:19:50.550 2 DEBUG nova.network.neutron [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Updating instance_info_cache with network_info: [{"id": "9f93c26c-cdde-4b08-b999-093d767b4724", "address": "fa:16:3e:90:7f:b3", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f93c26c-cd", "ovs_interfaceid": "9f93c26c-cdde-4b08-b999-093d767b4724", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:19:50 compute-0 nova_compute[192567]: 2025-10-02 08:19:50.578 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-cb60bdfa-c18e-443a-89d3-6173d3d01122" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:19:50 compute-0 nova_compute[192567]: 2025-10-02 08:19:50.597 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:19:50 compute-0 nova_compute[192567]: 2025-10-02 08:19:50.598 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:19:50 compute-0 nova_compute[192567]: 2025-10-02 08:19:50.599 2 DEBUG oslo_concurrency.lockutils [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:19:50 compute-0 nova_compute[192567]: 2025-10-02 08:19:50.606 2 INFO nova.virt.libvirt.driver [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:19:50 compute-0 virtqemud[192112]: Domain id=10 name='instance-0000000b' uuid=cb60bdfa-c18e-443a-89d3-6173d3d01122 is tainted: custom-monitor
Oct 02 08:19:50 compute-0 nova_compute[192567]: 2025-10-02 08:19:50.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:51 compute-0 nova_compute[192567]: 2025-10-02 08:19:51.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:51 compute-0 nova_compute[192567]: 2025-10-02 08:19:51.615 2 INFO nova.virt.libvirt.driver [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:19:52 compute-0 nova_compute[192567]: 2025-10-02 08:19:52.622 2 INFO nova.virt.libvirt.driver [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:19:52 compute-0 nova_compute[192567]: 2025-10-02 08:19:52.629 2 DEBUG nova.compute.manager [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:19:52 compute-0 nova_compute[192567]: 2025-10-02 08:19:52.816 2 DEBUG nova.objects.instance [None req-b2b6eee8-4cb3-4cc2-9487-d95fa97219b7 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:19:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:52.995 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:55 compute-0 nova_compute[192567]: 2025-10-02 08:19:55.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:56 compute-0 podman[219332]: 2025-10-02 08:19:56.208685023 +0000 UTC m=+0.100430957 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:19:56 compute-0 podman[219334]: 2025-10-02 08:19:56.2208418 +0000 UTC m=+0.105834043 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251001)
Oct 02 08:19:56 compute-0 podman[219333]: 2025-10-02 08:19:56.317225259 +0000 UTC m=+0.204938157 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 02 08:19:56 compute-0 nova_compute[192567]: 2025-10-02 08:19:56.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:58 compute-0 podman[219396]: 2025-10-02 08:19:58.171322091 +0000 UTC m=+0.085996636 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid)
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.648 2 DEBUG oslo_concurrency.lockutils [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.649 2 DEBUG oslo_concurrency.lockutils [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.649 2 DEBUG oslo_concurrency.lockutils [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.649 2 DEBUG oslo_concurrency.lockutils [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.650 2 DEBUG oslo_concurrency.lockutils [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.652 2 INFO nova.compute.manager [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Terminating instance
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.654 2 DEBUG nova.compute.manager [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:19:59 compute-0 kernel: tap90dae37e-3d (unregistering): left promiscuous mode
Oct 02 08:19:59 compute-0 NetworkManager[51654]: <info>  [1759393199.6856] device (tap90dae37e-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:59 compute-0 ovn_controller[94821]: 2025-10-02T08:19:59Z|00107|binding|INFO|Releasing lport 90dae37e-3d1c-4776-99d8-6dc1921fb7ba from this chassis (sb_readonly=0)
Oct 02 08:19:59 compute-0 ovn_controller[94821]: 2025-10-02T08:19:59Z|00108|binding|INFO|Setting lport 90dae37e-3d1c-4776-99d8-6dc1921fb7ba down in Southbound
Oct 02 08:19:59 compute-0 ovn_controller[94821]: 2025-10-02T08:19:59Z|00109|binding|INFO|Removing iface tap90dae37e-3d ovn-installed in OVS
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.708 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:cb:87 10.100.0.8'], port_security=['fa:16:3e:ce:cb:87 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '2992d4a5-e893-4c34-99fc-a7c5455d37f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ac58297e5b44744976c58f773f94090', 'neutron:revision_number': '4', 'neutron:security_group_ids': '92c02662-21d7-4fe7-9c02-e6a0bb798f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=176593bb-df9e-44fd-86b3-56aea7ef157a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=90dae37e-3d1c-4776-99d8-6dc1921fb7ba) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.711 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 90dae37e-3d1c-4776-99d8-6dc1921fb7ba in datapath d2dffba9-387a-40b6-bcfb-049fd17ed68f unbound from our chassis
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.714 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2dffba9-387a-40b6-bcfb-049fd17ed68f
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:59 compute-0 podman[203011]: time="2025-10-02T08:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:19:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.750 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e17509-6fcb-44eb-b88e-43d4e7a4bab1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3470 "" "Go-http-client/1.1"
Oct 02 08:19:59 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Oct 02 08:19:59 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Consumed 14.596s CPU time.
Oct 02 08:19:59 compute-0 systemd-machined[152597]: Machine qemu-9-instance-0000000c terminated.
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.791 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a32a7c-345d-4a42-b7d1-1ed2339bd5d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.796 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[763b2f45-713c-4e19-be17-240592e02c2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.838 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7da803-f40b-47c9-bb77-72a89064cf9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.860 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8c4dd6-dc7d-40f5-bd63-bb264ce40439]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2dffba9-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:a1:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401624, 'reachable_time': 37985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219429, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.891 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[294cbc68-a182-4072-90fb-0e5409d00d22]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd2dffba9-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401643, 'tstamp': 401643}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219431, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd2dffba9-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401648, 'tstamp': 401648}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219431, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.893 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2dffba9-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.903 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2dffba9-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.904 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.904 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2dffba9-30, col_values=(('external_ids', {'iface-id': 'e3ee8aeb-cc58-469b-9f75-ef53474d1d07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:19:59.905 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.936 2 INFO nova.virt.libvirt.driver [-] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Instance destroyed successfully.
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.936 2 DEBUG nova.objects.instance [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lazy-loading 'resources' on Instance uuid 2992d4a5-e893-4c34-99fc-a7c5455d37f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.952 2 DEBUG nova.virt.libvirt.vif [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:18:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-398970798',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-398970798',id=12,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:18:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ac58297e5b44744976c58f773f94090',ramdisk_id='',reservation_id='r-oyqi3b90',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_
ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:18:49Z,user_data=None,user_id='5455cae7258940a8926bef2dc2483570',uuid=2992d4a5-e893-4c34-99fc-a7c5455d37f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.953 2 DEBUG nova.network.os_vif_util [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converting VIF {"id": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "address": "fa:16:3e:ce:cb:87", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90dae37e-3d", "ovs_interfaceid": "90dae37e-3d1c-4776-99d8-6dc1921fb7ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.954 2 DEBUG nova.network.os_vif_util [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ce:cb:87,bridge_name='br-int',has_traffic_filtering=True,id=90dae37e-3d1c-4776-99d8-6dc1921fb7ba,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90dae37e-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.954 2 DEBUG os_vif [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:cb:87,bridge_name='br-int',has_traffic_filtering=True,id=90dae37e-3d1c-4776-99d8-6dc1921fb7ba,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90dae37e-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.958 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90dae37e-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.966 2 INFO os_vif [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ce:cb:87,bridge_name='br-int',has_traffic_filtering=True,id=90dae37e-3d1c-4776-99d8-6dc1921fb7ba,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90dae37e-3d')
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.967 2 INFO nova.virt.libvirt.driver [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Deleting instance files /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6_del
Oct 02 08:19:59 compute-0 nova_compute[192567]: 2025-10-02 08:19:59.968 2 INFO nova.virt.libvirt.driver [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Deletion of /var/lib/nova/instances/2992d4a5-e893-4c34-99fc-a7c5455d37f6_del complete
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.030 2 INFO nova.compute.manager [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Took 0.38 seconds to destroy the instance on the hypervisor.
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.031 2 DEBUG oslo.service.loopingcall [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.031 2 DEBUG nova.compute.manager [-] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.031 2 DEBUG nova.network.neutron [-] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.080 2 DEBUG nova.compute.manager [req-762613c4-dfb9-44eb-8c9b-44345aca8f8b req-b25f6417-781a-4733-81ac-239537fce804 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Received event network-vif-unplugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.081 2 DEBUG oslo_concurrency.lockutils [req-762613c4-dfb9-44eb-8c9b-44345aca8f8b req-b25f6417-781a-4733-81ac-239537fce804 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.081 2 DEBUG oslo_concurrency.lockutils [req-762613c4-dfb9-44eb-8c9b-44345aca8f8b req-b25f6417-781a-4733-81ac-239537fce804 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.082 2 DEBUG oslo_concurrency.lockutils [req-762613c4-dfb9-44eb-8c9b-44345aca8f8b req-b25f6417-781a-4733-81ac-239537fce804 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.082 2 DEBUG nova.compute.manager [req-762613c4-dfb9-44eb-8c9b-44345aca8f8b req-b25f6417-781a-4733-81ac-239537fce804 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] No waiting events found dispatching network-vif-unplugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.082 2 DEBUG nova.compute.manager [req-762613c4-dfb9-44eb-8c9b-44345aca8f8b req-b25f6417-781a-4733-81ac-239537fce804 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Received event network-vif-unplugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.544 2 DEBUG nova.network.neutron [-] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.560 2 INFO nova.compute.manager [-] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Took 0.53 seconds to deallocate network for instance.
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.603 2 DEBUG oslo_concurrency.lockutils [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.603 2 DEBUG oslo_concurrency.lockutils [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.606 2 DEBUG nova.compute.manager [req-9da9cd7e-1b82-4d8a-8917-5f88351db845 req-48b7d56a-3e51-4c3f-ac02-25db27beaee0 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Received event network-vif-deleted-90dae37e-3d1c-4776-99d8-6dc1921fb7ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.667 2 DEBUG nova.compute.provider_tree [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.685 2 DEBUG nova.scheduler.client.report [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.715 2 DEBUG oslo_concurrency.lockutils [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.737 2 INFO nova.scheduler.client.report [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Deleted allocations for instance 2992d4a5-e893-4c34-99fc-a7c5455d37f6
Oct 02 08:20:00 compute-0 nova_compute[192567]: 2025-10-02 08:20:00.808 2 DEBUG oslo_concurrency.lockutils [None req-b35a76ed-bfb3-46de-9dad-2da408abf984 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:01 compute-0 podman[219449]: 2025-10-02 08:20:01.150360791 +0000 UTC m=+0.064630752 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.371 2 DEBUG oslo_concurrency.lockutils [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "cb60bdfa-c18e-443a-89d3-6173d3d01122" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.372 2 DEBUG oslo_concurrency.lockutils [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "cb60bdfa-c18e-443a-89d3-6173d3d01122" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.373 2 DEBUG oslo_concurrency.lockutils [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "cb60bdfa-c18e-443a-89d3-6173d3d01122-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.373 2 DEBUG oslo_concurrency.lockutils [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "cb60bdfa-c18e-443a-89d3-6173d3d01122-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.373 2 DEBUG oslo_concurrency.lockutils [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "cb60bdfa-c18e-443a-89d3-6173d3d01122-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.375 2 INFO nova.compute.manager [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Terminating instance
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.377 2 DEBUG nova.compute.manager [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:20:01 compute-0 kernel: tap9f93c26c-cd (unregistering): left promiscuous mode
Oct 02 08:20:01 compute-0 NetworkManager[51654]: <info>  [1759393201.4052] device (tap9f93c26c-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:20:01 compute-0 openstack_network_exporter[205118]: ERROR   08:20:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:20:01 compute-0 openstack_network_exporter[205118]: ERROR   08:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:20:01 compute-0 openstack_network_exporter[205118]: ERROR   08:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:20:01 compute-0 ovn_controller[94821]: 2025-10-02T08:20:01Z|00110|binding|INFO|Releasing lport 9f93c26c-cdde-4b08-b999-093d767b4724 from this chassis (sb_readonly=0)
Oct 02 08:20:01 compute-0 ovn_controller[94821]: 2025-10-02T08:20:01Z|00111|binding|INFO|Setting lport 9f93c26c-cdde-4b08-b999-093d767b4724 down in Southbound
Oct 02 08:20:01 compute-0 ovn_controller[94821]: 2025-10-02T08:20:01Z|00112|binding|INFO|Removing iface tap9f93c26c-cd ovn-installed in OVS
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.434 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:7f:b3 10.100.0.4'], port_security=['fa:16:3e:90:7f:b3 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'cb60bdfa-c18e-443a-89d3-6173d3d01122', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ac58297e5b44744976c58f773f94090', 'neutron:revision_number': '13', 'neutron:security_group_ids': '92c02662-21d7-4fe7-9c02-e6a0bb798f9d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=176593bb-df9e-44fd-86b3-56aea7ef157a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=9f93c26c-cdde-4b08-b999-093d767b4724) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.436 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 9f93c26c-cdde-4b08-b999-093d767b4724 in datapath d2dffba9-387a-40b6-bcfb-049fd17ed68f unbound from our chassis
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.438 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d2dffba9-387a-40b6-bcfb-049fd17ed68f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.439 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f2537bf9-1ae2-4655-b996-cdcb9cae6710]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.440 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f namespace which is not needed anymore
Oct 02 08:20:01 compute-0 openstack_network_exporter[205118]: ERROR   08:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:20:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:20:01 compute-0 openstack_network_exporter[205118]: ERROR   08:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:20:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:01 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct 02 08:20:01 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000b.scope: Consumed 2.476s CPU time.
Oct 02 08:20:01 compute-0 systemd-machined[152597]: Machine qemu-10-instance-0000000b terminated.
Oct 02 08:20:01 compute-0 NetworkManager[51654]: <info>  [1759393201.6096] manager: (tap9f93c26c-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Oct 02 08:20:01 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218927]: [NOTICE]   (218932) : haproxy version is 2.8.14-c23fe91
Oct 02 08:20:01 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218927]: [NOTICE]   (218932) : path to executable is /usr/sbin/haproxy
Oct 02 08:20:01 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218927]: [WARNING]  (218932) : Exiting Master process...
Oct 02 08:20:01 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218927]: [ALERT]    (218932) : Current worker (218934) exited with code 143 (Terminated)
Oct 02 08:20:01 compute-0 neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f[218927]: [WARNING]  (218932) : All workers exited. Exiting... (0)
Oct 02 08:20:01 compute-0 systemd[1]: libpod-6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52.scope: Deactivated successfully.
Oct 02 08:20:01 compute-0 podman[219496]: 2025-10-02 08:20:01.628011771 +0000 UTC m=+0.062420273 container died 6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 02 08:20:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52-userdata-shm.mount: Deactivated successfully.
Oct 02 08:20:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-85176e27ed0370be109e2f38d9165906ba2da6858a117c79238bb0558d37261d-merged.mount: Deactivated successfully.
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.680 2 INFO nova.virt.libvirt.driver [-] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Instance destroyed successfully.
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.681 2 DEBUG nova.objects.instance [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lazy-loading 'resources' on Instance uuid cb60bdfa-c18e-443a-89d3-6173d3d01122 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:20:01 compute-0 podman[219496]: 2025-10-02 08:20:01.692432504 +0000 UTC m=+0.126841046 container cleanup 6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.693 2 DEBUG nova.virt.libvirt.vif [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:18:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1519695324',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1519695324',id=11,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:18:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ac58297e5b44744976c58f773f94090',ramdisk_id='',reservation_id='r-i59iux5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1763362073-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:19:52Z,user_data=None,user_id='5455cae7258940a8926bef2dc2483570',uuid=cb60bdfa-c18e-443a-89d3-6173d3d01122,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f93c26c-cdde-4b08-b999-093d767b4724", "address": "fa:16:3e:90:7f:b3", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f93c26c-cd", "ovs_interfaceid": "9f93c26c-cdde-4b08-b999-093d767b4724", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.693 2 DEBUG nova.network.os_vif_util [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converting VIF {"id": "9f93c26c-cdde-4b08-b999-093d767b4724", "address": "fa:16:3e:90:7f:b3", "network": {"id": "d2dffba9-387a-40b6-bcfb-049fd17ed68f", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1233296972-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a11c5c8c7f7b443889fb949a076c8815", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f93c26c-cd", "ovs_interfaceid": "9f93c26c-cdde-4b08-b999-093d767b4724", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.694 2 DEBUG nova.network.os_vif_util [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:90:7f:b3,bridge_name='br-int',has_traffic_filtering=True,id=9f93c26c-cdde-4b08-b999-093d767b4724,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f93c26c-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.694 2 DEBUG os_vif [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:7f:b3,bridge_name='br-int',has_traffic_filtering=True,id=9f93c26c-cdde-4b08-b999-093d767b4724,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f93c26c-cd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.697 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f93c26c-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:20:01 compute-0 systemd[1]: libpod-conmon-6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52.scope: Deactivated successfully.
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.706 2 INFO os_vif [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:90:7f:b3,bridge_name='br-int',has_traffic_filtering=True,id=9f93c26c-cdde-4b08-b999-093d767b4724,network=Network(d2dffba9-387a-40b6-bcfb-049fd17ed68f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f93c26c-cd')
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.706 2 INFO nova.virt.libvirt.driver [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Deleting instance files /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122_del
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.707 2 INFO nova.virt.libvirt.driver [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Deletion of /var/lib/nova/instances/cb60bdfa-c18e-443a-89d3-6173d3d01122_del complete
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.747 2 INFO nova.compute.manager [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Took 0.37 seconds to destroy the instance on the hypervisor.
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.747 2 DEBUG oslo.service.loopingcall [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.748 2 DEBUG nova.compute.manager [-] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.748 2 DEBUG nova.network.neutron [-] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:20:01 compute-0 podman[219542]: 2025-10-02 08:20:01.79512632 +0000 UTC m=+0.061293549 container remove 6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.801 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[61c6291c-9384-4ba9-ab47-5dedbd1aa4b5]: (4, ('Thu Oct  2 08:20:01 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f (6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52)\n6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52\nThu Oct  2 08:20:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f (6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52)\n6b216b68afc5e0018719290460166533455c945b03c8f73c6b7bf85afcf31c52\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.803 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8b6a615a-6c93-4f56-8509-c01af3762596]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.804 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2dffba9-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:20:01 compute-0 kernel: tapd2dffba9-30: left promiscuous mode
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:01 compute-0 nova_compute[192567]: 2025-10-02 08:20:01.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.826 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4d247e62-774a-4134-87bb-c5b6e4211977]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.856 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0466d684-e697-48fa-b3a9-9c928901931a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.857 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[71015b32-8c8d-4f75-bc31-5ab309c77f20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.884 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[1d09f646-21fb-4a27-89c7-0d5d72e5b1d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401610, 'reachable_time': 42384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219558, 'error': None, 'target': 'ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.891 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d2dffba9-387a-40b6-bcfb-049fd17ed68f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:20:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:01.892 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[83b7f1f6-c621-494a-8944-5d475c14ab20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:20:01 compute-0 systemd[1]: run-netns-ovnmeta\x2dd2dffba9\x2d387a\x2d40b6\x2dbcfb\x2d049fd17ed68f.mount: Deactivated successfully.
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.171 2 DEBUG nova.compute.manager [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Received event network-vif-plugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.171 2 DEBUG oslo_concurrency.lockutils [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.172 2 DEBUG oslo_concurrency.lockutils [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.173 2 DEBUG oslo_concurrency.lockutils [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "2992d4a5-e893-4c34-99fc-a7c5455d37f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.173 2 DEBUG nova.compute.manager [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] No waiting events found dispatching network-vif-plugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.174 2 WARNING nova.compute.manager [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Received unexpected event network-vif-plugged-90dae37e-3d1c-4776-99d8-6dc1921fb7ba for instance with vm_state deleted and task_state None.
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.174 2 DEBUG nova.compute.manager [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Received event network-vif-unplugged-9f93c26c-cdde-4b08-b999-093d767b4724 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.175 2 DEBUG oslo_concurrency.lockutils [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "cb60bdfa-c18e-443a-89d3-6173d3d01122-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.175 2 DEBUG oslo_concurrency.lockutils [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "cb60bdfa-c18e-443a-89d3-6173d3d01122-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.176 2 DEBUG oslo_concurrency.lockutils [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "cb60bdfa-c18e-443a-89d3-6173d3d01122-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.176 2 DEBUG nova.compute.manager [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] No waiting events found dispatching network-vif-unplugged-9f93c26c-cdde-4b08-b999-093d767b4724 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.177 2 DEBUG nova.compute.manager [req-e6718e7d-6296-48c2-922f-bfd728da070c req-d5e9eebc-104d-4dd4-a6e3-d69fdd566f44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Received event network-vif-unplugged-9f93c26c-cdde-4b08-b999-093d767b4724 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.226 2 DEBUG nova.network.neutron [-] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.242 2 INFO nova.compute.manager [-] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Took 0.49 seconds to deallocate network for instance.
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.291 2 DEBUG oslo_concurrency.lockutils [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.291 2 DEBUG oslo_concurrency.lockutils [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.299 2 DEBUG oslo_concurrency.lockutils [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.321 2 INFO nova.scheduler.client.report [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Deleted allocations for instance cb60bdfa-c18e-443a-89d3-6173d3d01122
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.403 2 DEBUG oslo_concurrency.lockutils [None req-e731ebd8-cfa5-4ea1-879a-28833b8df6d1 5455cae7258940a8926bef2dc2483570 7ac58297e5b44744976c58f773f94090 - - default default] Lock "cb60bdfa-c18e-443a-89d3-6173d3d01122" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:02 compute-0 nova_compute[192567]: 2025-10-02 08:20:02.676 2 DEBUG nova.compute.manager [req-35bd970c-a1d5-4365-8b3e-64eb3d859fd4 req-de16d5a1-dc16-4e0f-8eb3-a86f19212667 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Received event network-vif-deleted-9f93c26c-cdde-4b08-b999-093d767b4724 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:20:04 compute-0 nova_compute[192567]: 2025-10-02 08:20:04.260 2 DEBUG nova.compute.manager [req-f0cfb247-b4d2-4668-8e73-0364e7027744 req-b5440162-5ec7-4013-b759-92a752a6af31 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Received event network-vif-plugged-9f93c26c-cdde-4b08-b999-093d767b4724 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:20:04 compute-0 nova_compute[192567]: 2025-10-02 08:20:04.260 2 DEBUG oslo_concurrency.lockutils [req-f0cfb247-b4d2-4668-8e73-0364e7027744 req-b5440162-5ec7-4013-b759-92a752a6af31 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "cb60bdfa-c18e-443a-89d3-6173d3d01122-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:04 compute-0 nova_compute[192567]: 2025-10-02 08:20:04.261 2 DEBUG oslo_concurrency.lockutils [req-f0cfb247-b4d2-4668-8e73-0364e7027744 req-b5440162-5ec7-4013-b759-92a752a6af31 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "cb60bdfa-c18e-443a-89d3-6173d3d01122-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:04 compute-0 nova_compute[192567]: 2025-10-02 08:20:04.261 2 DEBUG oslo_concurrency.lockutils [req-f0cfb247-b4d2-4668-8e73-0364e7027744 req-b5440162-5ec7-4013-b759-92a752a6af31 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "cb60bdfa-c18e-443a-89d3-6173d3d01122-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:04 compute-0 nova_compute[192567]: 2025-10-02 08:20:04.261 2 DEBUG nova.compute.manager [req-f0cfb247-b4d2-4668-8e73-0364e7027744 req-b5440162-5ec7-4013-b759-92a752a6af31 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] No waiting events found dispatching network-vif-plugged-9f93c26c-cdde-4b08-b999-093d767b4724 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:20:04 compute-0 nova_compute[192567]: 2025-10-02 08:20:04.262 2 WARNING nova.compute.manager [req-f0cfb247-b4d2-4668-8e73-0364e7027744 req-b5440162-5ec7-4013-b759-92a752a6af31 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Received unexpected event network-vif-plugged-9f93c26c-cdde-4b08-b999-093d767b4724 for instance with vm_state deleted and task_state None.
Oct 02 08:20:06 compute-0 nova_compute[192567]: 2025-10-02 08:20:06.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:06 compute-0 nova_compute[192567]: 2025-10-02 08:20:06.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:11 compute-0 nova_compute[192567]: 2025-10-02 08:20:11.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:11 compute-0 nova_compute[192567]: 2025-10-02 08:20:11.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:14 compute-0 nova_compute[192567]: 2025-10-02 08:20:14.934 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393199.9330924, 2992d4a5-e893-4c34-99fc-a7c5455d37f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:20:14 compute-0 nova_compute[192567]: 2025-10-02 08:20:14.935 2 INFO nova.compute.manager [-] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] VM Stopped (Lifecycle Event)
Oct 02 08:20:14 compute-0 nova_compute[192567]: 2025-10-02 08:20:14.959 2 DEBUG nova.compute.manager [None req-ec47338a-7078-4eb2-b7db-78959edd0a37 - - - - - -] [instance: 2992d4a5-e893-4c34-99fc-a7c5455d37f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:20:15 compute-0 podman[219560]: 2025-10-02 08:20:15.216122274 +0000 UTC m=+0.120640004 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, 
release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Oct 02 08:20:16 compute-0 nova_compute[192567]: 2025-10-02 08:20:16.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:16 compute-0 nova_compute[192567]: 2025-10-02 08:20:16.676 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393201.6742728, cb60bdfa-c18e-443a-89d3-6173d3d01122 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:20:16 compute-0 nova_compute[192567]: 2025-10-02 08:20:16.676 2 INFO nova.compute.manager [-] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] VM Stopped (Lifecycle Event)
Oct 02 08:20:16 compute-0 nova_compute[192567]: 2025-10-02 08:20:16.699 2 DEBUG nova.compute.manager [None req-221cdecb-b112-4358-8bd1-9fbeb5c666cc - - - - - -] [instance: cb60bdfa-c18e-443a-89d3-6173d3d01122] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:20:16 compute-0 nova_compute[192567]: 2025-10-02 08:20:16.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:21 compute-0 nova_compute[192567]: 2025-10-02 08:20:21.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:21 compute-0 nova_compute[192567]: 2025-10-02 08:20:21.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:26 compute-0 nova_compute[192567]: 2025-10-02 08:20:26.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:26 compute-0 nova_compute[192567]: 2025-10-02 08:20:26.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:27 compute-0 nova_compute[192567]: 2025-10-02 08:20:27.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:27 compute-0 podman[219585]: 2025-10-02 08:20:27.221990373 +0000 UTC m=+0.089560228 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:20:27 compute-0 podman[219583]: 2025-10-02 08:20:27.22253313 +0000 UTC m=+0.094423849 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 08:20:27 compute-0 podman[219584]: 2025-10-02 08:20:27.272415662 +0000 UTC m=+0.142849895 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.658 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.659 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.660 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.660 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:20:28 compute-0 podman[219644]: 2025-10-02 08:20:28.844408137 +0000 UTC m=+0.117697953 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.897 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.899 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5886MB free_disk=73.46564865112305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.899 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.899 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.977 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:20:28 compute-0 nova_compute[192567]: 2025-10-02 08:20:28.978 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:20:29 compute-0 nova_compute[192567]: 2025-10-02 08:20:29.010 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:20:29 compute-0 nova_compute[192567]: 2025-10-02 08:20:29.034 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:20:29 compute-0 nova_compute[192567]: 2025-10-02 08:20:29.034 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:20:29 compute-0 nova_compute[192567]: 2025-10-02 08:20:29.053 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:20:29 compute-0 nova_compute[192567]: 2025-10-02 08:20:29.088 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:20:29 compute-0 nova_compute[192567]: 2025-10-02 08:20:29.119 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:20:29 compute-0 nova_compute[192567]: 2025-10-02 08:20:29.133 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:20:29 compute-0 nova_compute[192567]: 2025-10-02 08:20:29.162 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:20:29 compute-0 nova_compute[192567]: 2025-10-02 08:20:29.163 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:29 compute-0 podman[203011]: time="2025-10-02T08:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:20:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:20:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3002 "" "Go-http-client/1.1"
Oct 02 08:20:31 compute-0 openstack_network_exporter[205118]: ERROR   08:20:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:20:31 compute-0 openstack_network_exporter[205118]: ERROR   08:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:20:31 compute-0 openstack_network_exporter[205118]: ERROR   08:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:20:31 compute-0 openstack_network_exporter[205118]: ERROR   08:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:20:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:20:31 compute-0 openstack_network_exporter[205118]: ERROR   08:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:20:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:20:31 compute-0 nova_compute[192567]: 2025-10-02 08:20:31.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:31 compute-0 nova_compute[192567]: 2025-10-02 08:20:31.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:32 compute-0 nova_compute[192567]: 2025-10-02 08:20:32.163 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:20:32 compute-0 nova_compute[192567]: 2025-10-02 08:20:32.164 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:20:32 compute-0 nova_compute[192567]: 2025-10-02 08:20:32.164 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:20:32 compute-0 podman[219664]: 2025-10-02 08:20:32.169480392 +0000 UTC m=+0.078544135 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:20:32 compute-0 nova_compute[192567]: 2025-10-02 08:20:32.185 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:20:32 compute-0 nova_compute[192567]: 2025-10-02 08:20:32.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:20:32 compute-0 nova_compute[192567]: 2025-10-02 08:20:32.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:20:33 compute-0 nova_compute[192567]: 2025-10-02 08:20:33.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:20:34 compute-0 nova_compute[192567]: 2025-10-02 08:20:34.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:20:35 compute-0 nova_compute[192567]: 2025-10-02 08:20:35.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:20:36 compute-0 nova_compute[192567]: 2025-10-02 08:20:36.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:36 compute-0 nova_compute[192567]: 2025-10-02 08:20:36.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:39 compute-0 nova_compute[192567]: 2025-10-02 08:20:39.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:20:39 compute-0 nova_compute[192567]: 2025-10-02 08:20:39.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:20:41 compute-0 nova_compute[192567]: 2025-10-02 08:20:41.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:41 compute-0 nova_compute[192567]: 2025-10-02 08:20:41.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:43 compute-0 nova_compute[192567]: 2025-10-02 08:20:43.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:20:43 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:43.888 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:20:43 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:43.890 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:20:43 compute-0 nova_compute[192567]: 2025-10-02 08:20:43.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:45.977 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:20:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:45.978 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:20:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:45.979 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:20:46 compute-0 podman[219688]: 2025-10-02 08:20:46.200463082 +0000 UTC m=+0.111462119 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 02 08:20:46 compute-0 nova_compute[192567]: 2025-10-02 08:20:46.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:46 compute-0 nova_compute[192567]: 2025-10-02 08:20:46.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:20:49.892 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:20:51 compute-0 nova_compute[192567]: 2025-10-02 08:20:51.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:51 compute-0 nova_compute[192567]: 2025-10-02 08:20:51.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:56 compute-0 nova_compute[192567]: 2025-10-02 08:20:56.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:56 compute-0 nova_compute[192567]: 2025-10-02 08:20:56.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:20:58 compute-0 podman[219710]: 2025-10-02 08:20:58.189537429 +0000 UTC m=+0.090874828 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 02 08:20:58 compute-0 podman[219712]: 2025-10-02 08:20:58.205756774 +0000 UTC m=+0.099335832 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, container_name=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:20:58 compute-0 podman[219711]: 2025-10-02 08:20:58.238800682 +0000 UTC m=+0.133852435 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Oct 02 08:20:59 compute-0 podman[219773]: 2025-10-02 08:20:59.169940071 +0000 UTC m=+0.074466928 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 02 08:20:59 compute-0 podman[203011]: time="2025-10-02T08:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:20:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:20:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Oct 02 08:21:01 compute-0 openstack_network_exporter[205118]: ERROR   08:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:21:01 compute-0 openstack_network_exporter[205118]: ERROR   08:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:21:01 compute-0 openstack_network_exporter[205118]: ERROR   08:21:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:21:01 compute-0 openstack_network_exporter[205118]: ERROR   08:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:21:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:21:01 compute-0 openstack_network_exporter[205118]: ERROR   08:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:21:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:21:01 compute-0 nova_compute[192567]: 2025-10-02 08:21:01.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:01 compute-0 nova_compute[192567]: 2025-10-02 08:21:01.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:03 compute-0 podman[219793]: 2025-10-02 08:21:03.162609674 +0000 UTC m=+0.073084835 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:21:05 compute-0 ovn_controller[94821]: 2025-10-02T08:21:05Z|00113|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 02 08:21:06 compute-0 nova_compute[192567]: 2025-10-02 08:21:06.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:06 compute-0 nova_compute[192567]: 2025-10-02 08:21:06.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:11 compute-0 nova_compute[192567]: 2025-10-02 08:21:11.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:11 compute-0 nova_compute[192567]: 2025-10-02 08:21:11.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:16 compute-0 nova_compute[192567]: 2025-10-02 08:21:16.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:16 compute-0 nova_compute[192567]: 2025-10-02 08:21:16.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:17 compute-0 podman[219817]: 2025-10-02 08:21:17.150998691 +0000 UTC m=+0.061598667 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 02 08:21:21 compute-0 nova_compute[192567]: 2025-10-02 08:21:21.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:21 compute-0 nova_compute[192567]: 2025-10-02 08:21:21.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:26 compute-0 nova_compute[192567]: 2025-10-02 08:21:26.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:26 compute-0 nova_compute[192567]: 2025-10-02 08:21:26.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:29 compute-0 podman[219839]: 2025-10-02 08:21:29.189342978 +0000 UTC m=+0.095147565 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 02 08:21:29 compute-0 podman[219841]: 2025-10-02 08:21:29.222746401 +0000 UTC m=+0.123105380 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 02 08:21:29 compute-0 podman[219840]: 2025-10-02 08:21:29.241260804 +0000 UTC m=+0.140758446 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:21:29 compute-0 podman[219894]: 2025-10-02 08:21:29.316674858 +0000 UTC m=+0.084673652 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.647 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.648 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.648 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.648 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:21:29 compute-0 podman[203011]: time="2025-10-02T08:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:21:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:21:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3003 "" "Go-http-client/1.1"
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.852 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.854 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5883MB free_disk=73.46562957763672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.854 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.854 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.926 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:21:29 compute-0 nova_compute[192567]: 2025-10-02 08:21:29.927 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:21:30 compute-0 nova_compute[192567]: 2025-10-02 08:21:30.036 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:21:30 compute-0 nova_compute[192567]: 2025-10-02 08:21:30.057 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:21:30 compute-0 nova_compute[192567]: 2025-10-02 08:21:30.060 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:21:30 compute-0 nova_compute[192567]: 2025-10-02 08:21:30.061 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:30 compute-0 nova_compute[192567]: 2025-10-02 08:21:30.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:30 compute-0 nova_compute[192567]: 2025-10-02 08:21:30.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:21:30 compute-0 nova_compute[192567]: 2025-10-02 08:21:30.648 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:21:30 compute-0 nova_compute[192567]: 2025-10-02 08:21:30.648 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:30 compute-0 nova_compute[192567]: 2025-10-02 08:21:30.649 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:21:31 compute-0 openstack_network_exporter[205118]: ERROR   08:21:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:21:31 compute-0 openstack_network_exporter[205118]: ERROR   08:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:21:31 compute-0 openstack_network_exporter[205118]: ERROR   08:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:21:31 compute-0 openstack_network_exporter[205118]: ERROR   08:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:21:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:21:31 compute-0 openstack_network_exporter[205118]: ERROR   08:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:21:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:21:31 compute-0 nova_compute[192567]: 2025-10-02 08:21:31.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:31 compute-0 nova_compute[192567]: 2025-10-02 08:21:31.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:32 compute-0 nova_compute[192567]: 2025-10-02 08:21:32.663 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:32 compute-0 nova_compute[192567]: 2025-10-02 08:21:32.664 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:21:32 compute-0 nova_compute[192567]: 2025-10-02 08:21:32.664 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:21:32 compute-0 nova_compute[192567]: 2025-10-02 08:21:32.682 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:21:33 compute-0 nova_compute[192567]: 2025-10-02 08:21:33.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:34 compute-0 podman[219923]: 2025-10-02 08:21:34.157415688 +0000 UTC m=+0.069670357 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:21:34 compute-0 nova_compute[192567]: 2025-10-02 08:21:34.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:35 compute-0 nova_compute[192567]: 2025-10-02 08:21:35.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:35 compute-0 nova_compute[192567]: 2025-10-02 08:21:35.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:36 compute-0 nova_compute[192567]: 2025-10-02 08:21:36.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:36 compute-0 nova_compute[192567]: 2025-10-02 08:21:36.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:36 compute-0 nova_compute[192567]: 2025-10-02 08:21:36.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:39 compute-0 nova_compute[192567]: 2025-10-02 08:21:39.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:40 compute-0 nova_compute[192567]: 2025-10-02 08:21:40.634 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:40 compute-0 nova_compute[192567]: 2025-10-02 08:21:40.658 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:40 compute-0 nova_compute[192567]: 2025-10-02 08:21:40.659 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:21:41 compute-0 nova_compute[192567]: 2025-10-02 08:21:41.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:41 compute-0 nova_compute[192567]: 2025-10-02 08:21:41.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:43 compute-0 nova_compute[192567]: 2025-10-02 08:21:43.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:21:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:45.978 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:45.979 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:45.979 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:46 compute-0 nova_compute[192567]: 2025-10-02 08:21:46.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:46 compute-0 nova_compute[192567]: 2025-10-02 08:21:46.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:47 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 08:21:47 compute-0 podman[219948]: 2025-10-02 08:21:47.315987501 +0000 UTC m=+0.097549078 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.275 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.275 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.296 2 DEBUG nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.395 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.396 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.404 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.405 2 INFO nova.compute.claims [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.541 2 DEBUG nova.compute.provider_tree [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.559 2 DEBUG nova.scheduler.client.report [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.585 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.586 2 DEBUG nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.650 2 DEBUG nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.651 2 DEBUG nova.network.neutron [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.676 2 INFO nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.700 2 DEBUG nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.795 2 DEBUG nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.797 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.797 2 INFO nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Creating image(s)
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.798 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "/var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.799 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "/var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.800 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "/var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.825 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.924 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.926 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.927 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:49 compute-0 nova_compute[192567]: 2025-10-02 08:21:49.948 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.033 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.035 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.085 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.087 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.088 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.149 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.150 2 DEBUG nova.virt.disk.api [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Checking if we can resize image /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.151 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.213 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.215 2 DEBUG nova.virt.disk.api [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Cannot resize image /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.215 2 DEBUG nova.objects.instance [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'migration_context' on Instance uuid 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.233 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.234 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Ensure instance console log exists: /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.235 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.235 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.236 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:50 compute-0 nova_compute[192567]: 2025-10-02 08:21:50.358 2 DEBUG nova.network.neutron [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Successfully created port: 9da9e2f3-1dbb-49c3-93ba-a8284319d5da _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:21:51 compute-0 nova_compute[192567]: 2025-10-02 08:21:51.078 2 DEBUG nova.network.neutron [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Successfully updated port: 9da9e2f3-1dbb-49c3-93ba-a8284319d5da _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:21:51 compute-0 nova_compute[192567]: 2025-10-02 08:21:51.096 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:21:51 compute-0 nova_compute[192567]: 2025-10-02 08:21:51.097 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquired lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:21:51 compute-0 nova_compute[192567]: 2025-10-02 08:21:51.097 2 DEBUG nova.network.neutron [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:21:51 compute-0 nova_compute[192567]: 2025-10-02 08:21:51.175 2 DEBUG nova.compute.manager [req-1b23e2f7-2751-424a-9d9f-eb87545df935 req-56166a26-c992-4e93-b4ad-5ab834d4b2ce 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-changed-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:21:51 compute-0 nova_compute[192567]: 2025-10-02 08:21:51.176 2 DEBUG nova.compute.manager [req-1b23e2f7-2751-424a-9d9f-eb87545df935 req-56166a26-c992-4e93-b4ad-5ab834d4b2ce 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Refreshing instance network info cache due to event network-changed-9da9e2f3-1dbb-49c3-93ba-a8284319d5da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:21:51 compute-0 nova_compute[192567]: 2025-10-02 08:21:51.177 2 DEBUG oslo_concurrency.lockutils [req-1b23e2f7-2751-424a-9d9f-eb87545df935 req-56166a26-c992-4e93-b4ad-5ab834d4b2ce 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:21:51 compute-0 nova_compute[192567]: 2025-10-02 08:21:51.297 2 DEBUG nova.network.neutron [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:21:51 compute-0 nova_compute[192567]: 2025-10-02 08:21:51.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:51 compute-0 nova_compute[192567]: 2025-10-02 08:21:51.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:52 compute-0 nova_compute[192567]: 2025-10-02 08:21:52.959 2 DEBUG nova.network.neutron [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Updating instance_info_cache with network_info: [{"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:21:52 compute-0 nova_compute[192567]: 2025-10-02 08:21:52.995 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Releasing lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:21:52 compute-0 nova_compute[192567]: 2025-10-02 08:21:52.996 2 DEBUG nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Instance network_info: |[{"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:21:52 compute-0 nova_compute[192567]: 2025-10-02 08:21:52.996 2 DEBUG oslo_concurrency.lockutils [req-1b23e2f7-2751-424a-9d9f-eb87545df935 req-56166a26-c992-4e93-b4ad-5ab834d4b2ce 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:21:52 compute-0 nova_compute[192567]: 2025-10-02 08:21:52.997 2 DEBUG nova.network.neutron [req-1b23e2f7-2751-424a-9d9f-eb87545df935 req-56166a26-c992-4e93-b4ad-5ab834d4b2ce 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Refreshing network info cache for port 9da9e2f3-1dbb-49c3-93ba-a8284319d5da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.002 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Start _get_guest_xml network_info=[{"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.008 2 WARNING nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.015 2 DEBUG nova.virt.libvirt.host [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.016 2 DEBUG nova.virt.libvirt.host [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.020 2 DEBUG nova.virt.libvirt.host [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.021 2 DEBUG nova.virt.libvirt.host [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.022 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.022 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.023 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.024 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.024 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.024 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.025 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.025 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.026 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.026 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.026 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.027 2 DEBUG nova.virt.hardware [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.033 2 DEBUG nova.virt.libvirt.vif [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-178937293',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-178937293',id=14,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-au3gen8e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:21:49Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=4215f34f-2fe2-47bc-8a97-5ebd0d9de473,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.034 2 DEBUG nova.network.os_vif_util [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.035 2 DEBUG nova.network.os_vif_util [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:f0:2b,bridge_name='br-int',has_traffic_filtering=True,id=9da9e2f3-1dbb-49c3-93ba-a8284319d5da,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9da9e2f3-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.037 2 DEBUG nova.objects.instance [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'pci_devices' on Instance uuid 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.056 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <uuid>4215f34f-2fe2-47bc-8a97-5ebd0d9de473</uuid>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <name>instance-0000000e</name>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteStrategies-server-178937293</nova:name>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:21:53</nova:creationTime>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:21:53 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:21:53 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:21:53 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:21:53 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:21:53 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:21:53 compute-0 nova_compute[192567]:         <nova:user uuid="bf38fbc8dd7b4c4db6c469a7951b0942">tempest-TestExecuteStrategies-1382092507-project-admin</nova:user>
Oct 02 08:21:53 compute-0 nova_compute[192567]:         <nova:project uuid="1ea832b474574009921dff909e4daeaf">tempest-TestExecuteStrategies-1382092507</nova:project>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:21:53 compute-0 nova_compute[192567]:         <nova:port uuid="9da9e2f3-1dbb-49c3-93ba-a8284319d5da">
Oct 02 08:21:53 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <system>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <entry name="serial">4215f34f-2fe2-47bc-8a97-5ebd0d9de473</entry>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <entry name="uuid">4215f34f-2fe2-47bc-8a97-5ebd0d9de473</entry>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     </system>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <os>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   </os>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <features>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   </features>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk.config"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:36:f0:2b"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <target dev="tap9da9e2f3-1d"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/console.log" append="off"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <video>
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     </video>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:21:53 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:21:53 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:21:53 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:21:53 compute-0 nova_compute[192567]: </domain>
Oct 02 08:21:53 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.058 2 DEBUG nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Preparing to wait for external event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.059 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.059 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.060 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.061 2 DEBUG nova.virt.libvirt.vif [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-178937293',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-178937293',id=14,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-au3gen8e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:21:49Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=4215f34f-2fe2-47bc-8a97-5ebd0d9de473,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.061 2 DEBUG nova.network.os_vif_util [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.062 2 DEBUG nova.network.os_vif_util [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:f0:2b,bridge_name='br-int',has_traffic_filtering=True,id=9da9e2f3-1dbb-49c3-93ba-a8284319d5da,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9da9e2f3-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.063 2 DEBUG os_vif [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:f0:2b,bridge_name='br-int',has_traffic_filtering=True,id=9da9e2f3-1dbb-49c3-93ba-a8284319d5da,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9da9e2f3-1d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.064 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.065 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.069 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9da9e2f3-1d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.070 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9da9e2f3-1d, col_values=(('external_ids', {'iface-id': '9da9e2f3-1dbb-49c3-93ba-a8284319d5da', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:f0:2b', 'vm-uuid': '4215f34f-2fe2-47bc-8a97-5ebd0d9de473'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:53 compute-0 NetworkManager[51654]: <info>  [1759393313.1130] manager: (tap9da9e2f3-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.122 2 INFO os_vif [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:f0:2b,bridge_name='br-int',has_traffic_filtering=True,id=9da9e2f3-1dbb-49c3-93ba-a8284319d5da,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9da9e2f3-1d')
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.191 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.195 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.196 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No VIF found with MAC fa:16:3e:36:f0:2b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:21:53 compute-0 nova_compute[192567]: 2025-10-02 08:21:53.197 2 INFO nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Using config drive
Oct 02 08:21:54 compute-0 nova_compute[192567]: 2025-10-02 08:21:54.918 2 INFO nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Creating config drive at /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk.config
Oct 02 08:21:54 compute-0 nova_compute[192567]: 2025-10-02 08:21:54.927 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjmfslwaw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:21:55 compute-0 nova_compute[192567]: 2025-10-02 08:21:55.070 2 DEBUG oslo_concurrency.processutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjmfslwaw" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:21:55 compute-0 kernel: tap9da9e2f3-1d: entered promiscuous mode
Oct 02 08:21:55 compute-0 ovn_controller[94821]: 2025-10-02T08:21:55Z|00114|binding|INFO|Claiming lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da for this chassis.
Oct 02 08:21:55 compute-0 ovn_controller[94821]: 2025-10-02T08:21:55Z|00115|binding|INFO|9da9e2f3-1dbb-49c3-93ba-a8284319d5da: Claiming fa:16:3e:36:f0:2b 10.100.0.13
Oct 02 08:21:55 compute-0 nova_compute[192567]: 2025-10-02 08:21:55.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:55 compute-0 NetworkManager[51654]: <info>  [1759393315.1580] manager: (tap9da9e2f3-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Oct 02 08:21:55 compute-0 nova_compute[192567]: 2025-10-02 08:21:55.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:55 compute-0 nova_compute[192567]: 2025-10-02 08:21:55.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.177 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:f0:2b 10.100.0.13'], port_security=['fa:16:3e:36:f0:2b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4215f34f-2fe2-47bc-8a97-5ebd0d9de473', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=9da9e2f3-1dbb-49c3-93ba-a8284319d5da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.179 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 9da9e2f3-1dbb-49c3-93ba-a8284319d5da in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d bound to our chassis
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.181 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.198 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f9e82a-f512-43ec-8a3f-ad17647a4371]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.200 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08b16a0c-b1 in ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.203 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08b16a0c-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.203 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1d7684-d32b-4aad-9f09-b0e097ee64e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.204 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[93e6642c-b03a-4a89-bf76-cc4843123ba7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 systemd-udevd[220006]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:21:55 compute-0 systemd-machined[152597]: New machine qemu-11-instance-0000000e.
Oct 02 08:21:55 compute-0 NetworkManager[51654]: <info>  [1759393315.2354] device (tap9da9e2f3-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:21:55 compute-0 NetworkManager[51654]: <info>  [1759393315.2377] device (tap9da9e2f3-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:21:55 compute-0 ovn_controller[94821]: 2025-10-02T08:21:55Z|00116|binding|INFO|Setting lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da ovn-installed in OVS
Oct 02 08:21:55 compute-0 ovn_controller[94821]: 2025-10-02T08:21:55Z|00117|binding|INFO|Setting lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da up in Southbound
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.234 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[5f9983eb-b64a-4e43-bb96-1a4de58d73f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 nova_compute[192567]: 2025-10-02 08:21:55.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:55 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000e.
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.279 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[c3fdfa7e-e482-4b66-a2ab-8e9e606fc5f6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.320 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[f0c4b122-8aa1-4d69-8ec1-ad82399e3e41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 NetworkManager[51654]: <info>  [1759393315.3259] manager: (tap08b16a0c-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Oct 02 08:21:55 compute-0 systemd-udevd[220009]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.326 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf71bb5-c816-4361-b6f4-a5aa327d394e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.375 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[be453379-08f1-45f6-99a7-06a554f144fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.381 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[4df8bb59-1256-4a55-8009-d30900c5a9eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 NetworkManager[51654]: <info>  [1759393315.4191] device (tap08b16a0c-b0): carrier: link connected
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.430 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[12e4b077-0b15-4f64-8543-42ecbaa1b591]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.455 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[187e70fd-fd5a-4404-9138-fe21b48ca568]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420300, 'reachable_time': 35022, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220038, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.474 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5fc9b5ed-9f6a-42fe-967c-d03dc5da46b7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:c53f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 420300, 'tstamp': 420300}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220039, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.496 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[169bba9c-0781-4019-b147-5e4e4f403322]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420300, 'reachable_time': 35022, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220040, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.545 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd5163c-a0b5-42c7-a7d0-9476e7be13a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.652 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[c92d72e2-3e35-4254-8806-96015abf4180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.654 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.654 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.655 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:21:55 compute-0 NetworkManager[51654]: <info>  [1759393315.6584] manager: (tap08b16a0c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Oct 02 08:21:55 compute-0 nova_compute[192567]: 2025-10-02 08:21:55.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:55 compute-0 kernel: tap08b16a0c-b0: entered promiscuous mode
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.664 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:21:55 compute-0 ovn_controller[94821]: 2025-10-02T08:21:55Z|00118|binding|INFO|Releasing lport 748eef31-77a8-4b04-b6b7-dc0f7cc1cf65 from this chassis (sb_readonly=0)
Oct 02 08:21:55 compute-0 nova_compute[192567]: 2025-10-02 08:21:55.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:55 compute-0 nova_compute[192567]: 2025-10-02 08:21:55.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.668 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.670 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd04b55-8b52-4bf4-af6b-0ae31b123867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.671 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:21:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:55.672 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'env', 'PROCESS_TAG=haproxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08b16a0c-b69f-4a34-9bfe-830099adfe8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:21:55 compute-0 nova_compute[192567]: 2025-10-02 08:21:55.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:56 compute-0 podman[220079]: 2025-10-02 08:21:56.113052971 +0000 UTC m=+0.070331638 container create 3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.146 2 DEBUG nova.compute.manager [req-719e231f-2acd-4ff5-98c9-a2dc0426846e req-b1b763cd-0ec3-4874-9a68-99e02da424a4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.147 2 DEBUG oslo_concurrency.lockutils [req-719e231f-2acd-4ff5-98c9-a2dc0426846e req-b1b763cd-0ec3-4874-9a68-99e02da424a4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.147 2 DEBUG oslo_concurrency.lockutils [req-719e231f-2acd-4ff5-98c9-a2dc0426846e req-b1b763cd-0ec3-4874-9a68-99e02da424a4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.148 2 DEBUG oslo_concurrency.lockutils [req-719e231f-2acd-4ff5-98c9-a2dc0426846e req-b1b763cd-0ec3-4874-9a68-99e02da424a4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.148 2 DEBUG nova.compute.manager [req-719e231f-2acd-4ff5-98c9-a2dc0426846e req-b1b763cd-0ec3-4874-9a68-99e02da424a4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Processing event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:21:56 compute-0 systemd[1]: Started libpod-conmon-3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46.scope.
Oct 02 08:21:56 compute-0 podman[220079]: 2025-10-02 08:21:56.075736266 +0000 UTC m=+0.033014983 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:21:56 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:21:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/812d08d9f9b5cc983b2055f6b21bf81ce19cb528095f639646e7869f666a2fea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:56.196 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.203 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393316.202601, 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.204 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] VM Started (Lifecycle Event)
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.206 2 DEBUG nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:21:56 compute-0 podman[220079]: 2025-10-02 08:21:56.21128265 +0000 UTC m=+0.168561317 container init 3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.211 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.216 2 INFO nova.virt.libvirt.driver [-] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Instance spawned successfully.
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.217 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:21:56 compute-0 podman[220079]: 2025-10-02 08:21:56.224398225 +0000 UTC m=+0.181676862 container start 3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.241 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.246 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.250 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.250 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.251 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.251 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.251 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.252 2 DEBUG nova.virt.libvirt.driver [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:21:56 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220094]: [NOTICE]   (220098) : New worker (220100) forked
Oct 02 08:21:56 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220094]: [NOTICE]   (220098) : Loading success.
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.276 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.277 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393316.2028203, 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.278 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] VM Paused (Lifecycle Event)
Oct 02 08:21:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:56.283 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:21:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:21:56.284 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.296 2 DEBUG nova.network.neutron [req-1b23e2f7-2751-424a-9d9f-eb87545df935 req-56166a26-c992-4e93-b4ad-5ab834d4b2ce 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Updated VIF entry in instance network info cache for port 9da9e2f3-1dbb-49c3-93ba-a8284319d5da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.296 2 DEBUG nova.network.neutron [req-1b23e2f7-2751-424a-9d9f-eb87545df935 req-56166a26-c992-4e93-b4ad-5ab834d4b2ce 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Updating instance_info_cache with network_info: [{"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.303 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.305 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393316.2103648, 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.306 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] VM Resumed (Lifecycle Event)
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.317 2 DEBUG oslo_concurrency.lockutils [req-1b23e2f7-2751-424a-9d9f-eb87545df935 req-56166a26-c992-4e93-b4ad-5ab834d4b2ce 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.325 2 INFO nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Took 6.53 seconds to spawn the instance on the hypervisor.
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.326 2 DEBUG nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.327 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.332 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.354 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.376 2 INFO nova.compute.manager [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Took 7.01 seconds to build instance.
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.388 2 DEBUG oslo_concurrency.lockutils [None req-c03164d0-9732-49b5-a242-4bf7ed633e24 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:56 compute-0 nova_compute[192567]: 2025-10-02 08:21:56.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:58 compute-0 nova_compute[192567]: 2025-10-02 08:21:58.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:21:58 compute-0 nova_compute[192567]: 2025-10-02 08:21:58.235 2 DEBUG nova.compute.manager [req-c3d88da4-fee4-4509-9ae7-a98e980df85f req-5c4e2df3-8937-4b13-971f-33588a09c1f4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:21:58 compute-0 nova_compute[192567]: 2025-10-02 08:21:58.236 2 DEBUG oslo_concurrency.lockutils [req-c3d88da4-fee4-4509-9ae7-a98e980df85f req-5c4e2df3-8937-4b13-971f-33588a09c1f4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:21:58 compute-0 nova_compute[192567]: 2025-10-02 08:21:58.236 2 DEBUG oslo_concurrency.lockutils [req-c3d88da4-fee4-4509-9ae7-a98e980df85f req-5c4e2df3-8937-4b13-971f-33588a09c1f4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:21:58 compute-0 nova_compute[192567]: 2025-10-02 08:21:58.236 2 DEBUG oslo_concurrency.lockutils [req-c3d88da4-fee4-4509-9ae7-a98e980df85f req-5c4e2df3-8937-4b13-971f-33588a09c1f4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:21:58 compute-0 nova_compute[192567]: 2025-10-02 08:21:58.236 2 DEBUG nova.compute.manager [req-c3d88da4-fee4-4509-9ae7-a98e980df85f req-5c4e2df3-8937-4b13-971f-33588a09c1f4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:21:58 compute-0 nova_compute[192567]: 2025-10-02 08:21:58.236 2 WARNING nova.compute.manager [req-c3d88da4-fee4-4509-9ae7-a98e980df85f req-5c4e2df3-8937-4b13-971f-33588a09c1f4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received unexpected event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with vm_state active and task_state None.
Oct 02 08:21:59 compute-0 podman[203011]: time="2025-10-02T08:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:21:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:21:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3466 "" "Go-http-client/1.1"
Oct 02 08:22:00 compute-0 podman[220111]: 2025-10-02 08:22:00.211106843 +0000 UTC m=+0.104804804 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 02 08:22:00 compute-0 podman[220109]: 2025-10-02 08:22:00.223016941 +0000 UTC m=+0.123453570 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:22:00 compute-0 podman[220112]: 2025-10-02 08:22:00.284917907 +0000 UTC m=+0.166518763 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct 02 08:22:00 compute-0 podman[220110]: 2025-10-02 08:22:00.332191059 +0000 UTC m=+0.228911803 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:22:01 compute-0 openstack_network_exporter[205118]: ERROR   08:22:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:22:01 compute-0 openstack_network_exporter[205118]: ERROR   08:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:22:01 compute-0 openstack_network_exporter[205118]: ERROR   08:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:22:01 compute-0 openstack_network_exporter[205118]: ERROR   08:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:22:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:22:01 compute-0 openstack_network_exporter[205118]: ERROR   08:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:22:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:22:01 compute-0 nova_compute[192567]: 2025-10-02 08:22:01.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:03 compute-0 nova_compute[192567]: 2025-10-02 08:22:03.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:04 compute-0 nova_compute[192567]: 2025-10-02 08:22:04.924 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:22:04 compute-0 nova_compute[192567]: 2025-10-02 08:22:04.948 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Triggering sync for uuid 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Oct 02 08:22:04 compute-0 nova_compute[192567]: 2025-10-02 08:22:04.949 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:04 compute-0 nova_compute[192567]: 2025-10-02 08:22:04.950 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:04 compute-0 nova_compute[192567]: 2025-10-02 08:22:04.984 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:05 compute-0 podman[220194]: 2025-10-02 08:22:05.180223057 +0000 UTC m=+0.085551288 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:22:06 compute-0 nova_compute[192567]: 2025-10-02 08:22:06.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:07 compute-0 ovn_controller[94821]: 2025-10-02T08:22:07Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:36:f0:2b 10.100.0.13
Oct 02 08:22:07 compute-0 ovn_controller[94821]: 2025-10-02T08:22:07Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:36:f0:2b 10.100.0.13
Oct 02 08:22:08 compute-0 nova_compute[192567]: 2025-10-02 08:22:08.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:11 compute-0 nova_compute[192567]: 2025-10-02 08:22:11.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:13 compute-0 nova_compute[192567]: 2025-10-02 08:22:13.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:16 compute-0 nova_compute[192567]: 2025-10-02 08:22:16.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:18 compute-0 podman[220235]: 2025-10-02 08:22:18.141382402 +0000 UTC m=+0.058671567 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public)
Oct 02 08:22:18 compute-0 nova_compute[192567]: 2025-10-02 08:22:18.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:21 compute-0 nova_compute[192567]: 2025-10-02 08:22:21.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:23 compute-0 nova_compute[192567]: 2025-10-02 08:22:23.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:25 compute-0 ovn_controller[94821]: 2025-10-02T08:22:25Z|00119|memory_trim|INFO|Detected inactivity (last active 30025 ms ago): trimming memory
Oct 02 08:22:26 compute-0 nova_compute[192567]: 2025-10-02 08:22:26.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:28 compute-0 nova_compute[192567]: 2025-10-02 08:22:28.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:29 compute-0 podman[203011]: time="2025-10-02T08:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:22:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:22:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3468 "" "Go-http-client/1.1"
Oct 02 08:22:30 compute-0 nova_compute[192567]: 2025-10-02 08:22:30.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:22:30 compute-0 nova_compute[192567]: 2025-10-02 08:22:30.656 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:30 compute-0 nova_compute[192567]: 2025-10-02 08:22:30.657 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:30 compute-0 nova_compute[192567]: 2025-10-02 08:22:30.658 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:30 compute-0 nova_compute[192567]: 2025-10-02 08:22:30.658 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:22:30 compute-0 nova_compute[192567]: 2025-10-02 08:22:30.762 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:22:30 compute-0 podman[220258]: 2025-10-02 08:22:30.825025411 +0000 UTC m=+0.103416470 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:22:30 compute-0 podman[220260]: 2025-10-02 08:22:30.849739667 +0000 UTC m=+0.117738045 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Oct 02 08:22:30 compute-0 nova_compute[192567]: 2025-10-02 08:22:30.851 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:22:30 compute-0 nova_compute[192567]: 2025-10-02 08:22:30.853 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:22:30 compute-0 podman[220261]: 2025-10-02 08:22:30.873279505 +0000 UTC m=+0.132827691 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 02 08:22:30 compute-0 podman[220259]: 2025-10-02 08:22:30.881097746 +0000 UTC m=+0.146735241 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:22:30 compute-0 nova_compute[192567]: 2025-10-02 08:22:30.918 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.137 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.138 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5711MB free_disk=73.43696212768555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.139 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.139 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.262 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.263 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.263 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:22:31 compute-0 openstack_network_exporter[205118]: ERROR   08:22:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:22:31 compute-0 openstack_network_exporter[205118]: ERROR   08:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:22:31 compute-0 openstack_network_exporter[205118]: ERROR   08:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:22:31 compute-0 openstack_network_exporter[205118]: ERROR   08:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:22:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:22:31 compute-0 openstack_network_exporter[205118]: ERROR   08:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:22:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.438 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.455 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.479 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.480 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:31 compute-0 nova_compute[192567]: 2025-10-02 08:22:31.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:33 compute-0 nova_compute[192567]: 2025-10-02 08:22:33.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:34 compute-0 nova_compute[192567]: 2025-10-02 08:22:34.481 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:22:34 compute-0 nova_compute[192567]: 2025-10-02 08:22:34.481 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:22:34 compute-0 nova_compute[192567]: 2025-10-02 08:22:34.482 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:22:34 compute-0 nova_compute[192567]: 2025-10-02 08:22:34.987 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:22:34 compute-0 nova_compute[192567]: 2025-10-02 08:22:34.988 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:22:34 compute-0 nova_compute[192567]: 2025-10-02 08:22:34.988 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:22:34 compute-0 nova_compute[192567]: 2025-10-02 08:22:34.988 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:22:36 compute-0 podman[220339]: 2025-10-02 08:22:36.198358861 +0000 UTC m=+0.108034603 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:22:36 compute-0 nova_compute[192567]: 2025-10-02 08:22:36.405 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Updating instance_info_cache with network_info: [{"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:22:36 compute-0 nova_compute[192567]: 2025-10-02 08:22:36.428 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:22:36 compute-0 nova_compute[192567]: 2025-10-02 08:22:36.428 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:22:36 compute-0 nova_compute[192567]: 2025-10-02 08:22:36.429 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:22:36 compute-0 nova_compute[192567]: 2025-10-02 08:22:36.430 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:22:36 compute-0 nova_compute[192567]: 2025-10-02 08:22:36.430 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:22:36 compute-0 nova_compute[192567]: 2025-10-02 08:22:36.564 2 DEBUG nova.compute.manager [None req-a663ad7f-bd36-4833-a495-c741ee03c35a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Oct 02 08:22:36 compute-0 nova_compute[192567]: 2025-10-02 08:22:36.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:36 compute-0 nova_compute[192567]: 2025-10-02 08:22:36.652 2 DEBUG nova.compute.provider_tree [None req-a663ad7f-bd36-4833-a495-c741ee03c35a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 17 to 19 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:22:37 compute-0 nova_compute[192567]: 2025-10-02 08:22:37.568 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:22:38 compute-0 nova_compute[192567]: 2025-10-02 08:22:38.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:38 compute-0 nova_compute[192567]: 2025-10-02 08:22:38.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:22:40 compute-0 nova_compute[192567]: 2025-10-02 08:22:40.420 2 DEBUG nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Check if temp file /var/lib/nova/instances/tmpn4gwn2ko exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Oct 02 08:22:40 compute-0 nova_compute[192567]: 2025-10-02 08:22:40.421 2 DEBUG nova.compute.manager [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn4gwn2ko',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4215f34f-2fe2-47bc-8a97-5ebd0d9de473',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Oct 02 08:22:40 compute-0 nova_compute[192567]: 2025-10-02 08:22:40.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:22:40 compute-0 nova_compute[192567]: 2025-10-02 08:22:40.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:22:40 compute-0 nova_compute[192567]: 2025-10-02 08:22:40.968 2 DEBUG oslo_concurrency.processutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:22:41 compute-0 nova_compute[192567]: 2025-10-02 08:22:41.058 2 DEBUG oslo_concurrency.processutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:22:41 compute-0 nova_compute[192567]: 2025-10-02 08:22:41.060 2 DEBUG oslo_concurrency.processutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:22:41 compute-0 nova_compute[192567]: 2025-10-02 08:22:41.127 2 DEBUG oslo_concurrency.processutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:22:41 compute-0 nova_compute[192567]: 2025-10-02 08:22:41.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:43 compute-0 nova_compute[192567]: 2025-10-02 08:22:43.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:44 compute-0 nova_compute[192567]: 2025-10-02 08:22:44.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:22:44 compute-0 sshd-session[220371]: Accepted publickey for nova from 192.168.122.101 port 47982 ssh2: ECDSA SHA256:nyj9easCU2+zJyxXdAvgdE/0ePVxCLkFf7X2/rv3WZg
Oct 02 08:22:44 compute-0 systemd-logind[827]: New session 37 of user nova.
Oct 02 08:22:44 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Oct 02 08:22:44 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct 02 08:22:44 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct 02 08:22:44 compute-0 systemd[1]: Starting User Manager for UID 42436...
Oct 02 08:22:44 compute-0 systemd[220375]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:22:45 compute-0 systemd[220375]: Queued start job for default target Main User Target.
Oct 02 08:22:45 compute-0 systemd[220375]: Created slice User Application Slice.
Oct 02 08:22:45 compute-0 systemd[220375]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 02 08:22:45 compute-0 systemd[220375]: Started Daily Cleanup of User's Temporary Directories.
Oct 02 08:22:45 compute-0 systemd[220375]: Reached target Paths.
Oct 02 08:22:45 compute-0 systemd[220375]: Reached target Timers.
Oct 02 08:22:45 compute-0 systemd[220375]: Starting D-Bus User Message Bus Socket...
Oct 02 08:22:45 compute-0 systemd[220375]: Starting Create User's Volatile Files and Directories...
Oct 02 08:22:45 compute-0 systemd[220375]: Listening on D-Bus User Message Bus Socket.
Oct 02 08:22:45 compute-0 systemd[220375]: Reached target Sockets.
Oct 02 08:22:45 compute-0 systemd[220375]: Finished Create User's Volatile Files and Directories.
Oct 02 08:22:45 compute-0 systemd[220375]: Reached target Basic System.
Oct 02 08:22:45 compute-0 systemd[220375]: Reached target Main User Target.
Oct 02 08:22:45 compute-0 systemd[220375]: Startup finished in 183ms.
Oct 02 08:22:45 compute-0 systemd[1]: Started User Manager for UID 42436.
Oct 02 08:22:45 compute-0 systemd[1]: Started Session 37 of User nova.
Oct 02 08:22:45 compute-0 sshd-session[220371]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:22:45 compute-0 sshd-session[220390]: Received disconnect from 192.168.122.101 port 47982:11: disconnected by user
Oct 02 08:22:45 compute-0 sshd-session[220390]: Disconnected from user nova 192.168.122.101 port 47982
Oct 02 08:22:45 compute-0 sshd-session[220371]: pam_unix(sshd:session): session closed for user nova
Oct 02 08:22:45 compute-0 systemd-logind[827]: Session 37 logged out. Waiting for processes to exit.
Oct 02 08:22:45 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Oct 02 08:22:45 compute-0 systemd-logind[827]: Removed session 37.
Oct 02 08:22:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:45.979 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:45.980 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:45.981 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:46 compute-0 nova_compute[192567]: 2025-10-02 08:22:46.411 2 DEBUG nova.compute.manager [req-f3a351f6-6475-4768-ad82-e97d12425481 req-273a7ead-78bd-426a-beb6-061245fdb9c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-unplugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:46 compute-0 nova_compute[192567]: 2025-10-02 08:22:46.411 2 DEBUG oslo_concurrency.lockutils [req-f3a351f6-6475-4768-ad82-e97d12425481 req-273a7ead-78bd-426a-beb6-061245fdb9c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:46.411 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:22:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:46.412 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:22:46 compute-0 nova_compute[192567]: 2025-10-02 08:22:46.412 2 DEBUG oslo_concurrency.lockutils [req-f3a351f6-6475-4768-ad82-e97d12425481 req-273a7ead-78bd-426a-beb6-061245fdb9c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:46 compute-0 nova_compute[192567]: 2025-10-02 08:22:46.412 2 DEBUG oslo_concurrency.lockutils [req-f3a351f6-6475-4768-ad82-e97d12425481 req-273a7ead-78bd-426a-beb6-061245fdb9c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:46 compute-0 nova_compute[192567]: 2025-10-02 08:22:46.413 2 DEBUG nova.compute.manager [req-f3a351f6-6475-4768-ad82-e97d12425481 req-273a7ead-78bd-426a-beb6-061245fdb9c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-unplugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:22:46 compute-0 nova_compute[192567]: 2025-10-02 08:22:46.413 2 DEBUG nova.compute.manager [req-f3a351f6-6475-4768-ad82-e97d12425481 req-273a7ead-78bd-426a-beb6-061245fdb9c5 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-unplugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:22:46 compute-0 nova_compute[192567]: 2025-10-02 08:22:46.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:46 compute-0 nova_compute[192567]: 2025-10-02 08:22:46.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.116 2 INFO nova.compute.manager [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Took 5.99 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.117 2 DEBUG nova.compute.manager [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.134 2 DEBUG nova.compute.manager [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn4gwn2ko',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4215f34f-2fe2-47bc-8a97-5ebd0d9de473',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(8e3d4b13-3ffe-4766-b46e-5601a638df6c),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.159 2 DEBUG nova.objects.instance [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.160 2 DEBUG nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.163 2 DEBUG nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.163 2 DEBUG nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.185 2 DEBUG nova.virt.libvirt.vif [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-178937293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-178937293',id=14,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:21:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-au3gen8e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:21:56Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=4215f34f-2fe2-47bc-8a97-5ebd0d9de473,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.185 2 DEBUG nova.network.os_vif_util [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.187 2 DEBUG nova.network.os_vif_util [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:f0:2b,bridge_name='br-int',has_traffic_filtering=True,id=9da9e2f3-1dbb-49c3-93ba-a8284319d5da,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9da9e2f3-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.188 2 DEBUG nova.virt.libvirt.migration [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Updating guest XML with vif config: <interface type="ethernet">
Oct 02 08:22:47 compute-0 nova_compute[192567]:   <mac address="fa:16:3e:36:f0:2b"/>
Oct 02 08:22:47 compute-0 nova_compute[192567]:   <model type="virtio"/>
Oct 02 08:22:47 compute-0 nova_compute[192567]:   <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:22:47 compute-0 nova_compute[192567]:   <mtu size="1442"/>
Oct 02 08:22:47 compute-0 nova_compute[192567]:   <target dev="tap9da9e2f3-1d"/>
Oct 02 08:22:47 compute-0 nova_compute[192567]: </interface>
Oct 02 08:22:47 compute-0 nova_compute[192567]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.189 2 DEBUG nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.666 2 DEBUG nova.virt.libvirt.migration [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.666 2 INFO nova.virt.libvirt.migration [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 02 08:22:47 compute-0 nova_compute[192567]: 2025-10-02 08:22:47.735 2 INFO nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.260 2 DEBUG nova.virt.libvirt.migration [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.260 2 DEBUG nova.virt.libvirt.migration [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.510 2 DEBUG nova.compute.manager [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.510 2 DEBUG oslo_concurrency.lockutils [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.510 2 DEBUG oslo_concurrency.lockutils [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.511 2 DEBUG oslo_concurrency.lockutils [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.511 2 DEBUG nova.compute.manager [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.511 2 WARNING nova.compute.manager [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received unexpected event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with vm_state active and task_state migrating.
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.512 2 DEBUG nova.compute.manager [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-changed-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.512 2 DEBUG nova.compute.manager [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Refreshing instance network info cache due to event network-changed-9da9e2f3-1dbb-49c3-93ba-a8284319d5da. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.512 2 DEBUG oslo_concurrency.lockutils [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.512 2 DEBUG oslo_concurrency.lockutils [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.513 2 DEBUG nova.network.neutron [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Refreshing network info cache for port 9da9e2f3-1dbb-49c3-93ba-a8284319d5da _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.764 2 DEBUG nova.virt.libvirt.migration [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:22:48 compute-0 nova_compute[192567]: 2025-10-02 08:22:48.764 2 DEBUG nova.virt.libvirt.migration [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:22:49 compute-0 podman[220402]: 2025-10-02 08:22:49.193760156 +0000 UTC m=+0.095005051 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.263 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393369.2630494, 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.263 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] VM Paused (Lifecycle Event)
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.268 2 DEBUG nova.virt.libvirt.migration [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.268 2 DEBUG nova.virt.libvirt.migration [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.284 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.289 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.314 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] During sync_power_state the instance has a pending task (migrating). Skip.
Oct 02 08:22:49 compute-0 kernel: tap9da9e2f3-1d (unregistering): left promiscuous mode
Oct 02 08:22:49 compute-0 NetworkManager[51654]: <info>  [1759393369.4166] device (tap9da9e2f3-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.415 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00120|binding|INFO|Releasing lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da from this chassis (sb_readonly=0)
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00121|binding|INFO|Setting lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da down in Southbound
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00122|binding|INFO|Removing iface tap9da9e2f3-1d ovn-installed in OVS
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.439 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:f0:2b 10.100.0.13'], port_security=['fa:16:3e:36:f0:2b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '61f597a0-da80-455c-aab0-956a1e15f143'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4215f34f-2fe2-47bc-8a97-5ebd0d9de473', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=9da9e2f3-1dbb-49c3-93ba-a8284319d5da) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.441 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 9da9e2f3-1dbb-49c3-93ba-a8284319d5da in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.442 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.444 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[352ea06b-d684-45b5-91c5-5d68429c6ccf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.445 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace which is not needed anymore
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:49 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Oct 02 08:22:49 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Consumed 14.509s CPU time.
Oct 02 08:22:49 compute-0 systemd-machined[152597]: Machine qemu-11-instance-0000000e terminated.
Oct 02 08:22:49 compute-0 kernel: tap9da9e2f3-1d: entered promiscuous mode
Oct 02 08:22:49 compute-0 NetworkManager[51654]: <info>  [1759393369.6188] manager: (tap9da9e2f3-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00123|binding|INFO|Claiming lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da for this chassis.
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00124|binding|INFO|9da9e2f3-1dbb-49c3-93ba-a8284319d5da: Claiming fa:16:3e:36:f0:2b 10.100.0.13
Oct 02 08:22:49 compute-0 kernel: tap9da9e2f3-1d (unregistering): left promiscuous mode
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.632 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:f0:2b 10.100.0.13'], port_security=['fa:16:3e:36:f0:2b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '61f597a0-da80-455c-aab0-956a1e15f143'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4215f34f-2fe2-47bc-8a97-5ebd0d9de473', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=9da9e2f3-1dbb-49c3-93ba-a8284319d5da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.636 2 DEBUG nova.compute.manager [req-f890d020-6f63-4e62-bef7-cd35c816a6cf req-09f2b4a2-7c0b-46b8-9d86-f8332134adb1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-unplugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.636 2 DEBUG oslo_concurrency.lockutils [req-f890d020-6f63-4e62-bef7-cd35c816a6cf req-09f2b4a2-7c0b-46b8-9d86-f8332134adb1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.637 2 DEBUG oslo_concurrency.lockutils [req-f890d020-6f63-4e62-bef7-cd35c816a6cf req-09f2b4a2-7c0b-46b8-9d86-f8332134adb1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.637 2 DEBUG oslo_concurrency.lockutils [req-f890d020-6f63-4e62-bef7-cd35c816a6cf req-09f2b4a2-7c0b-46b8-9d86-f8332134adb1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.637 2 DEBUG nova.compute.manager [req-f890d020-6f63-4e62-bef7-cd35c816a6cf req-09f2b4a2-7c0b-46b8-9d86-f8332134adb1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-unplugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.637 2 DEBUG nova.compute.manager [req-f890d020-6f63-4e62-bef7-cd35c816a6cf req-09f2b4a2-7c0b-46b8-9d86-f8332134adb1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-unplugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00125|binding|INFO|Setting lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da ovn-installed in OVS
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00126|binding|INFO|Setting lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da up in Southbound
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00127|binding|INFO|Releasing lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da from this chassis (sb_readonly=1)
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00128|if_status|INFO|Not setting lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da down as sb is readonly
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00129|binding|INFO|Removing iface tap9da9e2f3-1d ovn-installed in OVS
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00130|binding|INFO|Releasing lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da from this chassis (sb_readonly=0)
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:49 compute-0 ovn_controller[94821]: 2025-10-02T08:22:49Z|00131|binding|INFO|Setting lport 9da9e2f3-1dbb-49c3-93ba-a8284319d5da down in Southbound
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.676 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:f0:2b 10.100.0.13'], port_security=['fa:16:3e:36:f0:2b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '61f597a0-da80-455c-aab0-956a1e15f143'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4215f34f-2fe2-47bc-8a97-5ebd0d9de473', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=9da9e2f3-1dbb-49c3-93ba-a8284319d5da) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:22:49 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220094]: [NOTICE]   (220098) : haproxy version is 2.8.14-c23fe91
Oct 02 08:22:49 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220094]: [NOTICE]   (220098) : path to executable is /usr/sbin/haproxy
Oct 02 08:22:49 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220094]: [WARNING]  (220098) : Exiting Master process...
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:49 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220094]: [ALERT]    (220098) : Current worker (220100) exited with code 143 (Terminated)
Oct 02 08:22:49 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220094]: [WARNING]  (220098) : All workers exited. Exiting... (0)
Oct 02 08:22:49 compute-0 systemd[1]: libpod-3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46.scope: Deactivated successfully.
Oct 02 08:22:49 compute-0 podman[220447]: 2025-10-02 08:22:49.700912347 +0000 UTC m=+0.111040386 container died 3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.706 2 DEBUG nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.706 2 DEBUG nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.706 2 DEBUG nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Oct 02 08:22:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-812d08d9f9b5cc983b2055f6b21bf81ce19cb528095f639646e7869f666a2fea-merged.mount: Deactivated successfully.
Oct 02 08:22:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46-userdata-shm.mount: Deactivated successfully.
Oct 02 08:22:49 compute-0 podman[220447]: 2025-10-02 08:22:49.751798621 +0000 UTC m=+0.161926700 container cleanup 3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.771 2 DEBUG nova.virt.libvirt.guest [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '4215f34f-2fe2-47bc-8a97-5ebd0d9de473' (instance-0000000e) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.771 2 INFO nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Migration operation has completed
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.771 2 INFO nova.compute.manager [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] _post_live_migration() is started..
Oct 02 08:22:49 compute-0 systemd[1]: libpod-conmon-3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46.scope: Deactivated successfully.
Oct 02 08:22:49 compute-0 podman[220484]: 2025-10-02 08:22:49.849892476 +0000 UTC m=+0.062875676 container remove 3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.858 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5fc0a232-7c8a-4c04-9d3e-e0bdf7ea75a5]: (4, ('Thu Oct  2 08:22:49 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46)\n3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46\nThu Oct  2 08:22:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46)\n3f5926323bdfbb9b2125b3b57a050079554cd0c703a4d41d435398833c34eb46\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.861 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[028e41da-faaf-4cdc-b129-ff6fef30e9a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.862 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:49 compute-0 kernel: tap08b16a0c-b0: left promiscuous mode
Oct 02 08:22:49 compute-0 nova_compute[192567]: 2025-10-02 08:22:49.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.902 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[c6268958-4146-4f4f-bc3d-29365d325b2b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.927 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[00d1e7ad-007b-4c0c-a46d-e32a123e5dff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.928 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[384cdd6e-a764-4c12-b795-eaeb241dfe9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.953 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[620c8b95-a460-40d6-83ce-b795de615f5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 420290, 'reachable_time': 26586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220503, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:22:49 compute-0 systemd[1]: run-netns-ovnmeta\x2d08b16a0c\x2db69f\x2d4a34\x2d9bfe\x2d830099adfe8d.mount: Deactivated successfully.
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.956 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.956 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a07b0a-90eb-4f62-9502-fe376f48469e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.959 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 9da9e2f3-1dbb-49c3-93ba-a8284319d5da in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.962 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.965 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4f9303-6f21-4cd2-9e2f-a3b92d2fc614]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.966 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 9da9e2f3-1dbb-49c3-93ba-a8284319d5da in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.968 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:22:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:22:49.968 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[6435f26b-dae2-46ea-9eeb-74a6166cd13f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:22:50 compute-0 nova_compute[192567]: 2025-10-02 08:22:50.195 2 DEBUG nova.network.neutron [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Updated VIF entry in instance network info cache for port 9da9e2f3-1dbb-49c3-93ba-a8284319d5da. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:22:50 compute-0 nova_compute[192567]: 2025-10-02 08:22:50.196 2 DEBUG nova.network.neutron [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Updating instance_info_cache with network_info: [{"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:22:50 compute-0 nova_compute[192567]: 2025-10-02 08:22:50.215 2 DEBUG oslo_concurrency.lockutils [req-d7dea0c5-99dc-4987-a9df-057a30a02b33 req-1131363f-089a-4f5b-b837-6053a3109f35 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-4215f34f-2fe2-47bc-8a97-5ebd0d9de473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.180 2 DEBUG nova.network.neutron [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Activated binding for port 9da9e2f3-1dbb-49c3-93ba-a8284319d5da and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.181 2 DEBUG nova.compute.manager [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.183 2 DEBUG nova.virt.libvirt.vif [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:21:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-178937293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-178937293',id=14,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:21:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-au3gen8e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:22:38Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=4215f34f-2fe2-47bc-8a97-5ebd0d9de473,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.184 2 DEBUG nova.network.os_vif_util [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "address": "fa:16:3e:36:f0:2b", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9da9e2f3-1d", "ovs_interfaceid": "9da9e2f3-1dbb-49c3-93ba-a8284319d5da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.186 2 DEBUG nova.network.os_vif_util [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:f0:2b,bridge_name='br-int',has_traffic_filtering=True,id=9da9e2f3-1dbb-49c3-93ba-a8284319d5da,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9da9e2f3-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.187 2 DEBUG os_vif [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:f0:2b,bridge_name='br-int',has_traffic_filtering=True,id=9da9e2f3-1dbb-49c3-93ba-a8284319d5da,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9da9e2f3-1d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.190 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9da9e2f3-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.198 2 INFO os_vif [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:f0:2b,bridge_name='br-int',has_traffic_filtering=True,id=9da9e2f3-1dbb-49c3-93ba-a8284319d5da,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9da9e2f3-1d')
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.199 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.199 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.200 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.200 2 DEBUG nova.compute.manager [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.201 2 INFO nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Deleting instance files /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473_del
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.202 2 INFO nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Deletion of /var/lib/nova/instances/4215f34f-2fe2-47bc-8a97-5ebd0d9de473_del complete
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.756 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.757 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.757 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.758 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.758 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.759 2 WARNING nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received unexpected event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with vm_state active and task_state migrating.
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.759 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.760 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.760 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.760 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.761 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.761 2 WARNING nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received unexpected event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with vm_state active and task_state migrating.
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.761 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-unplugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.762 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.762 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.763 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.763 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-unplugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.763 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-unplugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.764 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.764 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.765 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.765 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.765 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.766 2 WARNING nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received unexpected event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with vm_state active and task_state migrating.
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.766 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.767 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.767 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.767 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.768 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.768 2 WARNING nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received unexpected event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with vm_state active and task_state migrating.
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.769 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.769 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.769 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.770 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.770 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.771 2 WARNING nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received unexpected event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with vm_state active and task_state migrating.
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.771 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.771 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.772 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.772 2 DEBUG oslo_concurrency.lockutils [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.773 2 DEBUG nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] No waiting events found dispatching network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:22:51 compute-0 nova_compute[192567]: 2025-10-02 08:22:51.773 2 WARNING nova.compute.manager [req-0945a71f-822e-4dcb-a44e-ec7390ed9439 req-f03a87c4-e0df-4b0e-a29a-7caec7cc5d3b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Received unexpected event network-vif-plugged-9da9e2f3-1dbb-49c3-93ba-a8284319d5da for instance with vm_state active and task_state migrating.
Oct 02 08:22:55 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Oct 02 08:22:55 compute-0 systemd[220375]: Activating special unit Exit the Session...
Oct 02 08:22:55 compute-0 systemd[220375]: Stopped target Main User Target.
Oct 02 08:22:55 compute-0 systemd[220375]: Stopped target Basic System.
Oct 02 08:22:55 compute-0 systemd[220375]: Stopped target Paths.
Oct 02 08:22:55 compute-0 systemd[220375]: Stopped target Sockets.
Oct 02 08:22:55 compute-0 systemd[220375]: Stopped target Timers.
Oct 02 08:22:55 compute-0 systemd[220375]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 02 08:22:55 compute-0 systemd[220375]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 02 08:22:55 compute-0 systemd[220375]: Closed D-Bus User Message Bus Socket.
Oct 02 08:22:55 compute-0 systemd[220375]: Stopped Create User's Volatile Files and Directories.
Oct 02 08:22:55 compute-0 systemd[220375]: Removed slice User Application Slice.
Oct 02 08:22:55 compute-0 systemd[220375]: Reached target Shutdown.
Oct 02 08:22:55 compute-0 systemd[220375]: Finished Exit the Session.
Oct 02 08:22:55 compute-0 systemd[220375]: Reached target Exit the Session.
Oct 02 08:22:55 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Oct 02 08:22:55 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Oct 02 08:22:55 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct 02 08:22:55 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct 02 08:22:55 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct 02 08:22:55 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct 02 08:22:55 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Oct 02 08:22:56 compute-0 nova_compute[192567]: 2025-10-02 08:22:56.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:56 compute-0 nova_compute[192567]: 2025-10-02 08:22:56.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.120 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.121 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.121 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "4215f34f-2fe2-47bc-8a97-5ebd0d9de473-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.151 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.152 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.153 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.153 2 DEBUG nova.compute.resource_tracker [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.363 2 WARNING nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.365 2 DEBUG nova.compute.resource_tracker [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5888MB free_disk=73.46561431884766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", 
"product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.365 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.365 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.412 2 DEBUG nova.compute.resource_tracker [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Migration for instance 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.436 2 DEBUG nova.compute.resource_tracker [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.501 2 DEBUG nova.compute.resource_tracker [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Migration 8e3d4b13-3ffe-4766-b46e-5601a638df6c is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.501 2 DEBUG nova.compute.resource_tracker [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.502 2 DEBUG nova.compute.resource_tracker [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.553 2 DEBUG nova.compute.provider_tree [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.573 2 DEBUG nova.scheduler.client.report [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.610 2 DEBUG nova.compute.resource_tracker [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.611 2 DEBUG oslo_concurrency.lockutils [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.618 2 INFO nova.compute.manager [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.727 2 INFO nova.scheduler.client.report [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Deleted allocation for migration 8e3d4b13-3ffe-4766-b46e-5601a638df6c
Oct 02 08:22:57 compute-0 nova_compute[192567]: 2025-10-02 08:22:57.728 2 DEBUG nova.virt.libvirt.driver [None req-7658fef2-ffab-47a6-9249-f0a7e52dff59 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Oct 02 08:22:59 compute-0 podman[203011]: time="2025-10-02T08:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:22:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:22:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3003 "" "Go-http-client/1.1"
Oct 02 08:23:01 compute-0 podman[220507]: 2025-10-02 08:23:01.161010581 +0000 UTC m=+0.065441896 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:23:01 compute-0 podman[220509]: 2025-10-02 08:23:01.192316509 +0000 UTC m=+0.081841513 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 02 08:23:01 compute-0 nova_compute[192567]: 2025-10-02 08:23:01.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:01 compute-0 podman[220515]: 2025-10-02 08:23:01.22657469 +0000 UTC m=+0.101177182 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible)
Oct 02 08:23:01 compute-0 podman[220508]: 2025-10-02 08:23:01.229610103 +0000 UTC m=+0.124030688 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 02 08:23:01 compute-0 openstack_network_exporter[205118]: ERROR   08:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:23:01 compute-0 openstack_network_exporter[205118]: ERROR   08:23:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:23:01 compute-0 openstack_network_exporter[205118]: ERROR   08:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:23:01 compute-0 openstack_network_exporter[205118]: ERROR   08:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:23:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:23:01 compute-0 openstack_network_exporter[205118]: ERROR   08:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:23:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:23:01 compute-0 nova_compute[192567]: 2025-10-02 08:23:01.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:04 compute-0 nova_compute[192567]: 2025-10-02 08:23:04.703 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393369.701875, 4215f34f-2fe2-47bc-8a97-5ebd0d9de473 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:23:04 compute-0 nova_compute[192567]: 2025-10-02 08:23:04.704 2 INFO nova.compute.manager [-] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] VM Stopped (Lifecycle Event)
Oct 02 08:23:04 compute-0 nova_compute[192567]: 2025-10-02 08:23:04.730 2 DEBUG nova.compute.manager [None req-a89fde87-174a-4a21-a7ec-4e03bca48eba - - - - - -] [instance: 4215f34f-2fe2-47bc-8a97-5ebd0d9de473] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:23:06 compute-0 nova_compute[192567]: 2025-10-02 08:23:06.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:06 compute-0 nova_compute[192567]: 2025-10-02 08:23:06.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:07 compute-0 podman[220590]: 2025-10-02 08:23:07.172633408 +0000 UTC m=+0.079806270 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:23:11 compute-0 nova_compute[192567]: 2025-10-02 08:23:11.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:11 compute-0 nova_compute[192567]: 2025-10-02 08:23:11.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:16 compute-0 nova_compute[192567]: 2025-10-02 08:23:16.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:16 compute-0 nova_compute[192567]: 2025-10-02 08:23:16.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:20 compute-0 podman[220616]: 2025-10-02 08:23:20.166244108 +0000 UTC m=+0.078039586 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Oct 02 08:23:20 compute-0 nova_compute[192567]: 2025-10-02 08:23:20.756 2 DEBUG nova.compute.manager [None req-9fd5df38-60ab-4d02-863b-42f66cd6ba53 06fd0ba32e344f06ac22f27398df6fab a46cbd7217a541c58391886cae342f44 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Oct 02 08:23:20 compute-0 nova_compute[192567]: 2025-10-02 08:23:20.830 2 DEBUG nova.compute.provider_tree [None req-9fd5df38-60ab-4d02-863b-42f66cd6ba53 06fd0ba32e344f06ac22f27398df6fab a46cbd7217a541c58391886cae342f44 - - default default] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 19 to 22 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:23:21 compute-0 nova_compute[192567]: 2025-10-02 08:23:21.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:21 compute-0 nova_compute[192567]: 2025-10-02 08:23:21.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:26 compute-0 nova_compute[192567]: 2025-10-02 08:23:26.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:26 compute-0 nova_compute[192567]: 2025-10-02 08:23:26.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:29 compute-0 podman[203011]: time="2025-10-02T08:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:23:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:23:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3000 "" "Go-http-client/1.1"
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:31 compute-0 openstack_network_exporter[205118]: ERROR   08:23:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:23:31 compute-0 openstack_network_exporter[205118]: ERROR   08:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:23:31 compute-0 openstack_network_exporter[205118]: ERROR   08:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:23:31 compute-0 openstack_network_exporter[205118]: ERROR   08:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:23:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:23:31 compute-0 openstack_network_exporter[205118]: ERROR   08:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:23:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.654 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.654 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.655 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.655 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.905 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.906 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5885MB free_disk=73.46564865112305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.907 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.907 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.986 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:23:31 compute-0 nova_compute[192567]: 2025-10-02 08:23:31.987 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:23:32 compute-0 nova_compute[192567]: 2025-10-02 08:23:32.040 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:23:32 compute-0 nova_compute[192567]: 2025-10-02 08:23:32.074 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:23:32 compute-0 nova_compute[192567]: 2025-10-02 08:23:32.075 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:23:32 compute-0 nova_compute[192567]: 2025-10-02 08:23:32.075 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:32 compute-0 podman[220637]: 2025-10-02 08:23:32.171318092 +0000 UTC m=+0.086835498 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:23:32 compute-0 podman[220639]: 2025-10-02 08:23:32.185367196 +0000 UTC m=+0.090592054 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 02 08:23:32 compute-0 podman[220640]: 2025-10-02 08:23:32.207668476 +0000 UTC m=+0.100633114 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 08:23:32 compute-0 podman[220638]: 2025-10-02 08:23:32.234068964 +0000 UTC m=+0.138593769 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 02 08:23:32 compute-0 ovn_controller[94821]: 2025-10-02T08:23:32Z|00132|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 02 08:23:34 compute-0 nova_compute[192567]: 2025-10-02 08:23:34.075 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:23:34 compute-0 nova_compute[192567]: 2025-10-02 08:23:34.076 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:23:34 compute-0 nova_compute[192567]: 2025-10-02 08:23:34.076 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:23:34 compute-0 nova_compute[192567]: 2025-10-02 08:23:34.103 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:23:35 compute-0 nova_compute[192567]: 2025-10-02 08:23:35.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:23:35 compute-0 nova_compute[192567]: 2025-10-02 08:23:35.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:23:36 compute-0 nova_compute[192567]: 2025-10-02 08:23:36.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:36 compute-0 nova_compute[192567]: 2025-10-02 08:23:36.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:23:36 compute-0 nova_compute[192567]: 2025-10-02 08:23:36.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:37 compute-0 nova_compute[192567]: 2025-10-02 08:23:37.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:23:38 compute-0 podman[220717]: 2025-10-02 08:23:38.185642255 +0000 UTC m=+0.091873344 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:23:38 compute-0 nova_compute[192567]: 2025-10-02 08:23:38.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:23:40 compute-0 nova_compute[192567]: 2025-10-02 08:23:40.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:23:41 compute-0 nova_compute[192567]: 2025-10-02 08:23:41.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:41 compute-0 nova_compute[192567]: 2025-10-02 08:23:41.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:23:41 compute-0 nova_compute[192567]: 2025-10-02 08:23:41.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:23:41 compute-0 nova_compute[192567]: 2025-10-02 08:23:41.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:44 compute-0 nova_compute[192567]: 2025-10-02 08:23:44.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:23:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:45.980 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:45.981 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:45.981 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:46 compute-0 nova_compute[192567]: 2025-10-02 08:23:46.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:46 compute-0 nova_compute[192567]: 2025-10-02 08:23:46.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.474 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.475 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.496 2 DEBUG nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.582 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.583 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.594 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.595 2 INFO nova.compute.claims [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.729 2 DEBUG nova.compute.provider_tree [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.747 2 DEBUG nova.scheduler.client.report [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.779 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.780 2 DEBUG nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.827 2 DEBUG nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.828 2 DEBUG nova.network.neutron [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.846 2 INFO nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.863 2 DEBUG nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.965 2 DEBUG nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.967 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.968 2 INFO nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Creating image(s)
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.969 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "/var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.970 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "/var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.972 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "/var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:49 compute-0 nova_compute[192567]: 2025-10-02 08:23:49.997 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.093 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.095 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.096 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.121 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.209 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.212 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.256 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.257 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.258 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.315 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.317 2 DEBUG nova.virt.disk.api [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Checking if we can resize image /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.318 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.391 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.392 2 DEBUG nova.virt.disk.api [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Cannot resize image /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.393 2 DEBUG nova.objects.instance [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'migration_context' on Instance uuid 8fa7a424-b570-4b4b-a24b-843ae1bfe666 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.409 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.409 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Ensure instance console log exists: /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.410 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.411 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.411 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:50.491 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:50.493 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:23:50 compute-0 nova_compute[192567]: 2025-10-02 08:23:50.594 2 DEBUG nova.network.neutron [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Successfully created port: 0b3c593e-59e3-49a0-b072-6632cc26ea8c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:23:51 compute-0 podman[220756]: 2025-10-02 08:23:51.171552696 +0000 UTC m=+0.083211436 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 02 08:23:51 compute-0 nova_compute[192567]: 2025-10-02 08:23:51.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:51 compute-0 nova_compute[192567]: 2025-10-02 08:23:51.346 2 DEBUG nova.network.neutron [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Successfully updated port: 0b3c593e-59e3-49a0-b072-6632cc26ea8c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:23:51 compute-0 nova_compute[192567]: 2025-10-02 08:23:51.361 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:23:51 compute-0 nova_compute[192567]: 2025-10-02 08:23:51.361 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquired lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:23:51 compute-0 nova_compute[192567]: 2025-10-02 08:23:51.362 2 DEBUG nova.network.neutron [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:23:51 compute-0 nova_compute[192567]: 2025-10-02 08:23:51.459 2 DEBUG nova.compute.manager [req-3e6465bc-b423-4b0d-814e-533405d52e7b req-ed618e44-aff9-4042-9010-1935c75c2bf4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-changed-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:23:51 compute-0 nova_compute[192567]: 2025-10-02 08:23:51.459 2 DEBUG nova.compute.manager [req-3e6465bc-b423-4b0d-814e-533405d52e7b req-ed618e44-aff9-4042-9010-1935c75c2bf4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Refreshing instance network info cache due to event network-changed-0b3c593e-59e3-49a0-b072-6632cc26ea8c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:23:51 compute-0 nova_compute[192567]: 2025-10-02 08:23:51.460 2 DEBUG oslo_concurrency.lockutils [req-3e6465bc-b423-4b0d-814e-533405d52e7b req-ed618e44-aff9-4042-9010-1935c75c2bf4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:23:51 compute-0 nova_compute[192567]: 2025-10-02 08:23:51.517 2 DEBUG nova.network.neutron [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:23:51 compute-0 nova_compute[192567]: 2025-10-02 08:23:51.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.073 2 DEBUG nova.network.neutron [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Updating instance_info_cache with network_info: [{"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.100 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Releasing lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.101 2 DEBUG nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Instance network_info: |[{"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.102 2 DEBUG oslo_concurrency.lockutils [req-3e6465bc-b423-4b0d-814e-533405d52e7b req-ed618e44-aff9-4042-9010-1935c75c2bf4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.102 2 DEBUG nova.network.neutron [req-3e6465bc-b423-4b0d-814e-533405d52e7b req-ed618e44-aff9-4042-9010-1935c75c2bf4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Refreshing network info cache for port 0b3c593e-59e3-49a0-b072-6632cc26ea8c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.108 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Start _get_guest_xml network_info=[{"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.116 2 WARNING nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.121 2 DEBUG nova.virt.libvirt.host [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.122 2 DEBUG nova.virt.libvirt.host [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.126 2 DEBUG nova.virt.libvirt.host [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.128 2 DEBUG nova.virt.libvirt.host [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.128 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.129 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.131 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.131 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.132 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.132 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.133 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.133 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.134 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.135 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.135 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.136 2 DEBUG nova.virt.hardware [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.144 2 DEBUG nova.virt.libvirt.vif [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:23:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1316611014',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1316611014',id=16,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-535074ku',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:23:49Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=8fa7a424-b570-4b4b-a24b-843ae1bfe666,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.145 2 DEBUG nova.network.os_vif_util [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.146 2 DEBUG nova.network.os_vif_util [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b3c593e-59e3-49a0-b072-6632cc26ea8c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3c593e-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.148 2 DEBUG nova.objects.instance [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'pci_devices' on Instance uuid 8fa7a424-b570-4b4b-a24b-843ae1bfe666 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.166 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <uuid>8fa7a424-b570-4b4b-a24b-843ae1bfe666</uuid>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <name>instance-00000010</name>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteStrategies-server-1316611014</nova:name>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:23:53</nova:creationTime>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:23:53 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:23:53 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:23:53 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:23:53 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:23:53 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:23:53 compute-0 nova_compute[192567]:         <nova:user uuid="bf38fbc8dd7b4c4db6c469a7951b0942">tempest-TestExecuteStrategies-1382092507-project-admin</nova:user>
Oct 02 08:23:53 compute-0 nova_compute[192567]:         <nova:project uuid="1ea832b474574009921dff909e4daeaf">tempest-TestExecuteStrategies-1382092507</nova:project>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:23:53 compute-0 nova_compute[192567]:         <nova:port uuid="0b3c593e-59e3-49a0-b072-6632cc26ea8c">
Oct 02 08:23:53 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <system>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <entry name="serial">8fa7a424-b570-4b4b-a24b-843ae1bfe666</entry>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <entry name="uuid">8fa7a424-b570-4b4b-a24b-843ae1bfe666</entry>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     </system>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <os>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   </os>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <features>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   </features>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk.config"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:ab:9f:fa"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <target dev="tap0b3c593e-59"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/console.log" append="off"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <video>
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     </video>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:23:53 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:23:53 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:23:53 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:23:53 compute-0 nova_compute[192567]: </domain>
Oct 02 08:23:53 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.167 2 DEBUG nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Preparing to wait for external event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.167 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.168 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.168 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.169 2 DEBUG nova.virt.libvirt.vif [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:23:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1316611014',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1316611014',id=16,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-535074ku',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:23:49Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=8fa7a424-b570-4b4b-a24b-843ae1bfe666,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.170 2 DEBUG nova.network.os_vif_util [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.171 2 DEBUG nova.network.os_vif_util [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b3c593e-59e3-49a0-b072-6632cc26ea8c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3c593e-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.171 2 DEBUG os_vif [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b3c593e-59e3-49a0-b072-6632cc26ea8c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3c593e-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.173 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.173 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.178 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b3c593e-59, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.179 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b3c593e-59, col_values=(('external_ids', {'iface-id': '0b3c593e-59e3-49a0-b072-6632cc26ea8c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:9f:fa', 'vm-uuid': '8fa7a424-b570-4b4b-a24b-843ae1bfe666'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:53 compute-0 NetworkManager[51654]: <info>  [1759393433.1825] manager: (tap0b3c593e-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.190 2 INFO os_vif [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b3c593e-59e3-49a0-b072-6632cc26ea8c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3c593e-59')
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.278 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.279 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.279 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No VIF found with MAC fa:16:3e:ab:9f:fa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:23:53 compute-0 nova_compute[192567]: 2025-10-02 08:23:53.280 2 INFO nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Using config drive
Oct 02 08:23:55 compute-0 nova_compute[192567]: 2025-10-02 08:23:55.070 2 INFO nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Creating config drive at /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk.config
Oct 02 08:23:55 compute-0 nova_compute[192567]: 2025-10-02 08:23:55.079 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplr1wwctt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:23:55 compute-0 nova_compute[192567]: 2025-10-02 08:23:55.206 2 DEBUG oslo_concurrency.processutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplr1wwctt" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:23:55 compute-0 kernel: tap0b3c593e-59: entered promiscuous mode
Oct 02 08:23:55 compute-0 NetworkManager[51654]: <info>  [1759393435.2953] manager: (tap0b3c593e-59): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Oct 02 08:23:55 compute-0 ovn_controller[94821]: 2025-10-02T08:23:55Z|00133|binding|INFO|Claiming lport 0b3c593e-59e3-49a0-b072-6632cc26ea8c for this chassis.
Oct 02 08:23:55 compute-0 ovn_controller[94821]: 2025-10-02T08:23:55Z|00134|binding|INFO|0b3c593e-59e3-49a0-b072-6632cc26ea8c: Claiming fa:16:3e:ab:9f:fa 10.100.0.4
Oct 02 08:23:55 compute-0 nova_compute[192567]: 2025-10-02 08:23:55.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.310 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:9f:fa 10.100.0.4'], port_security=['fa:16:3e:ab:9f:fa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8fa7a424-b570-4b4b-a24b-843ae1bfe666', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=0b3c593e-59e3-49a0-b072-6632cc26ea8c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.312 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 0b3c593e-59e3-49a0-b072-6632cc26ea8c in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d bound to our chassis
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.314 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:23:55 compute-0 ovn_controller[94821]: 2025-10-02T08:23:55Z|00135|binding|INFO|Setting lport 0b3c593e-59e3-49a0-b072-6632cc26ea8c ovn-installed in OVS
Oct 02 08:23:55 compute-0 ovn_controller[94821]: 2025-10-02T08:23:55Z|00136|binding|INFO|Setting lport 0b3c593e-59e3-49a0-b072-6632cc26ea8c up in Southbound
Oct 02 08:23:55 compute-0 nova_compute[192567]: 2025-10-02 08:23:55.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:55 compute-0 nova_compute[192567]: 2025-10-02 08:23:55.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.335 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[812ca4e5-d65a-4b4f-96b9-1d9b409aafed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.337 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08b16a0c-b1 in ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.339 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08b16a0c-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.339 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[56c23c7c-a02a-47a7-89bd-1d8fe8cc5b46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.340 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[060bc506-514c-4a44-b11b-75ecfcae4e26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 systemd-udevd[220798]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.360 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[af0c2cbd-31ac-4306-83e6-ba10866024c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 systemd-machined[152597]: New machine qemu-12-instance-00000010.
Oct 02 08:23:55 compute-0 NetworkManager[51654]: <info>  [1759393435.3761] device (tap0b3c593e-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:23:55 compute-0 NetworkManager[51654]: <info>  [1759393435.3796] device (tap0b3c593e-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:23:55 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000010.
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.397 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[7db77ef3-518d-4e2d-a8fe-cb4dd42adf1d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.428 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[7373bb85-3838-4db4-9970-e37496330889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.435 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[17069edc-1250-4774-9f0a-8293236ad229]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 NetworkManager[51654]: <info>  [1759393435.4381] manager: (tap08b16a0c-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.484 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[31366676-2992-44e2-82f0-071913a214d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.488 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[e22229a9-3aa4-4b27-98a9-a10c7db807f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.496 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:23:55 compute-0 NetworkManager[51654]: <info>  [1759393435.5279] device (tap08b16a0c-b0): carrier: link connected
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.538 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[08ec84f6-060c-4008-9f6c-53baba302e40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.567 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[6adcead4-1983-4abd-bbfc-07021b9224e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432311, 'reachable_time': 20672, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220836, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.594 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f32f9294-bb1e-489f-9f99-5de599c76228]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:c53f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 432311, 'tstamp': 432311}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220837, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.624 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8464b1bd-35b1-4905-a7c6-342ec1b4d138]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432311, 'reachable_time': 20672, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220839, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.661 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac15cf2-5773-4dce-8b1b-b51b5df4597b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.721 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[beded1df-6aa3-43bd-8d48-27858f94c778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.723 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.723 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.724 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:23:55 compute-0 kernel: tap08b16a0c-b0: entered promiscuous mode
Oct 02 08:23:55 compute-0 NetworkManager[51654]: <info>  [1759393435.7265] manager: (tap08b16a0c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Oct 02 08:23:55 compute-0 nova_compute[192567]: 2025-10-02 08:23:55.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.730 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:23:55 compute-0 nova_compute[192567]: 2025-10-02 08:23:55.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:55 compute-0 ovn_controller[94821]: 2025-10-02T08:23:55Z|00137|binding|INFO|Releasing lport 748eef31-77a8-4b04-b6b7-dc0f7cc1cf65 from this chassis (sb_readonly=0)
Oct 02 08:23:55 compute-0 nova_compute[192567]: 2025-10-02 08:23:55.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:55 compute-0 nova_compute[192567]: 2025-10-02 08:23:55.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.759 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.760 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb53ead-ac73-44bd-bab6-2fe4efee7dbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.761 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:23:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:23:55.762 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'env', 'PROCESS_TAG=haproxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08b16a0c-b69f-4a34-9bfe-830099adfe8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.102 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393436.101211, 8fa7a424-b570-4b4b-a24b-843ae1bfe666 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.103 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] VM Started (Lifecycle Event)
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.127 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.133 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393436.1020873, 8fa7a424-b570-4b4b-a24b-843ae1bfe666 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.134 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] VM Paused (Lifecycle Event)
Oct 02 08:23:56 compute-0 podman[220871]: 2025-10-02 08:23:56.161472502 +0000 UTC m=+0.065890269 container create ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.165 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.172 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.201 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.216 2 DEBUG nova.compute.manager [req-5f2cc340-443e-4be8-be80-4a802815d916 req-2a8ea1ee-a558-4bdc-a4c8-c06ed9bc0b06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.217 2 DEBUG oslo_concurrency.lockutils [req-5f2cc340-443e-4be8-be80-4a802815d916 req-2a8ea1ee-a558-4bdc-a4c8-c06ed9bc0b06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.218 2 DEBUG oslo_concurrency.lockutils [req-5f2cc340-443e-4be8-be80-4a802815d916 req-2a8ea1ee-a558-4bdc-a4c8-c06ed9bc0b06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:56 compute-0 podman[220871]: 2025-10-02 08:23:56.127405018 +0000 UTC m=+0.031822785 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.218 2 DEBUG oslo_concurrency.lockutils [req-5f2cc340-443e-4be8-be80-4a802815d916 req-2a8ea1ee-a558-4bdc-a4c8-c06ed9bc0b06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.219 2 DEBUG nova.compute.manager [req-5f2cc340-443e-4be8-be80-4a802815d916 req-2a8ea1ee-a558-4bdc-a4c8-c06ed9bc0b06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Processing event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.221 2 DEBUG nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:23:56 compute-0 systemd[1]: Started libpod-conmon-ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95.scope.
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.226 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393436.2260299, 8fa7a424-b570-4b4b-a24b-843ae1bfe666 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.227 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] VM Resumed (Lifecycle Event)
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.236 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.241 2 INFO nova.virt.libvirt.driver [-] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Instance spawned successfully.
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.242 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.264 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:23:56 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.274 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:23:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f1fd761fe1a881cf1cf21a900caca1fe45c1795610b49de48ec1894ccf0f984/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.280 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.281 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.282 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.283 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.283 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.284 2 DEBUG nova.virt.libvirt.driver [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:23:56 compute-0 podman[220871]: 2025-10-02 08:23:56.292467465 +0000 UTC m=+0.196885232 container init ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.299 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:23:56 compute-0 podman[220871]: 2025-10-02 08:23:56.304590301 +0000 UTC m=+0.209008038 container start ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.346 2 INFO nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Took 6.38 seconds to spawn the instance on the hypervisor.
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.346 2 DEBUG nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:23:56 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220888]: [NOTICE]   (220892) : New worker (220894) forked
Oct 02 08:23:56 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220888]: [NOTICE]   (220892) : Loading success.
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.420 2 INFO nova.compute.manager [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Took 6.88 seconds to build instance.
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.440 2 DEBUG oslo_concurrency.lockutils [None req-6d28ca65-96e5-4381-b4e5-8b3dd6dd8456 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:56 compute-0 nova_compute[192567]: 2025-10-02 08:23:56.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:57 compute-0 nova_compute[192567]: 2025-10-02 08:23:57.341 2 DEBUG nova.network.neutron [req-3e6465bc-b423-4b0d-814e-533405d52e7b req-ed618e44-aff9-4042-9010-1935c75c2bf4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Updated VIF entry in instance network info cache for port 0b3c593e-59e3-49a0-b072-6632cc26ea8c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:23:57 compute-0 nova_compute[192567]: 2025-10-02 08:23:57.342 2 DEBUG nova.network.neutron [req-3e6465bc-b423-4b0d-814e-533405d52e7b req-ed618e44-aff9-4042-9010-1935c75c2bf4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Updating instance_info_cache with network_info: [{"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:23:57 compute-0 nova_compute[192567]: 2025-10-02 08:23:57.372 2 DEBUG oslo_concurrency.lockutils [req-3e6465bc-b423-4b0d-814e-533405d52e7b req-ed618e44-aff9-4042-9010-1935c75c2bf4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:23:58 compute-0 nova_compute[192567]: 2025-10-02 08:23:58.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:23:58 compute-0 nova_compute[192567]: 2025-10-02 08:23:58.344 2 DEBUG nova.compute.manager [req-a384ebf8-e5d9-40fa-b14c-e19ca064daf4 req-79a3d662-2ce3-44f5-8953-a1d6eb43a54e 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:23:58 compute-0 nova_compute[192567]: 2025-10-02 08:23:58.344 2 DEBUG oslo_concurrency.lockutils [req-a384ebf8-e5d9-40fa-b14c-e19ca064daf4 req-79a3d662-2ce3-44f5-8953-a1d6eb43a54e 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:23:58 compute-0 nova_compute[192567]: 2025-10-02 08:23:58.345 2 DEBUG oslo_concurrency.lockutils [req-a384ebf8-e5d9-40fa-b14c-e19ca064daf4 req-79a3d662-2ce3-44f5-8953-a1d6eb43a54e 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:23:58 compute-0 nova_compute[192567]: 2025-10-02 08:23:58.345 2 DEBUG oslo_concurrency.lockutils [req-a384ebf8-e5d9-40fa-b14c-e19ca064daf4 req-79a3d662-2ce3-44f5-8953-a1d6eb43a54e 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:23:58 compute-0 nova_compute[192567]: 2025-10-02 08:23:58.346 2 DEBUG nova.compute.manager [req-a384ebf8-e5d9-40fa-b14c-e19ca064daf4 req-79a3d662-2ce3-44f5-8953-a1d6eb43a54e 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] No waiting events found dispatching network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:23:58 compute-0 nova_compute[192567]: 2025-10-02 08:23:58.346 2 WARNING nova.compute.manager [req-a384ebf8-e5d9-40fa-b14c-e19ca064daf4 req-79a3d662-2ce3-44f5-8953-a1d6eb43a54e 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received unexpected event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c for instance with vm_state active and task_state None.
Oct 02 08:23:59 compute-0 podman[203011]: time="2025-10-02T08:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:23:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:23:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3460 "" "Go-http-client/1.1"
Oct 02 08:24:01 compute-0 openstack_network_exporter[205118]: ERROR   08:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:24:01 compute-0 openstack_network_exporter[205118]: ERROR   08:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:24:01 compute-0 openstack_network_exporter[205118]: ERROR   08:24:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:24:01 compute-0 openstack_network_exporter[205118]: ERROR   08:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:24:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:24:01 compute-0 openstack_network_exporter[205118]: ERROR   08:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:24:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:24:01 compute-0 nova_compute[192567]: 2025-10-02 08:24:01.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:03 compute-0 podman[220903]: 2025-10-02 08:24:03.183027667 +0000 UTC m=+0.081720679 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 08:24:03 compute-0 nova_compute[192567]: 2025-10-02 08:24:03.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:03 compute-0 podman[220906]: 2025-10-02 08:24:03.194035918 +0000 UTC m=+0.085629510 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 02 08:24:03 compute-0 podman[220905]: 2025-10-02 08:24:03.195597957 +0000 UTC m=+0.090078649 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible)
Oct 02 08:24:03 compute-0 podman[220904]: 2025-10-02 08:24:03.231410464 +0000 UTC m=+0.120672004 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:24:06 compute-0 nova_compute[192567]: 2025-10-02 08:24:06.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:07 compute-0 ovn_controller[94821]: 2025-10-02T08:24:07Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ab:9f:fa 10.100.0.4
Oct 02 08:24:07 compute-0 ovn_controller[94821]: 2025-10-02T08:24:07Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:9f:fa 10.100.0.4
Oct 02 08:24:08 compute-0 nova_compute[192567]: 2025-10-02 08:24:08.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:09 compute-0 podman[221002]: 2025-10-02 08:24:09.178839167 +0000 UTC m=+0.081211755 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:24:11 compute-0 nova_compute[192567]: 2025-10-02 08:24:11.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:13 compute-0 nova_compute[192567]: 2025-10-02 08:24:13.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:16 compute-0 nova_compute[192567]: 2025-10-02 08:24:16.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:18 compute-0 nova_compute[192567]: 2025-10-02 08:24:18.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:21 compute-0 nova_compute[192567]: 2025-10-02 08:24:21.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:22 compute-0 podman[221028]: 2025-10-02 08:24:22.165898003 +0000 UTC m=+0.080992217 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 02 08:24:22 compute-0 nova_compute[192567]: 2025-10-02 08:24:22.368 2 DEBUG nova.compute.manager [None req-75984296-7209-4082-ac7f-d4a15be498dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Oct 02 08:24:22 compute-0 nova_compute[192567]: 2025-10-02 08:24:22.415 2 DEBUG nova.compute.provider_tree [None req-75984296-7209-4082-ac7f-d4a15be498dc f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 22 to 24 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:24:23 compute-0 nova_compute[192567]: 2025-10-02 08:24:23.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:25 compute-0 ovn_controller[94821]: 2025-10-02T08:24:25Z|00138|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct 02 08:24:26 compute-0 nova_compute[192567]: 2025-10-02 08:24:26.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:28 compute-0 nova_compute[192567]: 2025-10-02 08:24:28.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:28 compute-0 nova_compute[192567]: 2025-10-02 08:24:28.787 2 DEBUG nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Check if temp file /var/lib/nova/instances/tmpkrrl14vi exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Oct 02 08:24:28 compute-0 nova_compute[192567]: 2025-10-02 08:24:28.787 2 DEBUG nova.compute.manager [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkrrl14vi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8fa7a424-b570-4b4b-a24b-843ae1bfe666',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Oct 02 08:24:29 compute-0 nova_compute[192567]: 2025-10-02 08:24:29.509 2 DEBUG oslo_concurrency.processutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:24:29 compute-0 nova_compute[192567]: 2025-10-02 08:24:29.606 2 DEBUG oslo_concurrency.processutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:24:29 compute-0 nova_compute[192567]: 2025-10-02 08:24:29.608 2 DEBUG oslo_concurrency.processutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:24:29 compute-0 nova_compute[192567]: 2025-10-02 08:24:29.674 2 DEBUG oslo_concurrency.processutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:24:29 compute-0 podman[203011]: time="2025-10-02T08:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:24:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:24:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3468 "" "Go-http-client/1.1"
Oct 02 08:24:31 compute-0 openstack_network_exporter[205118]: ERROR   08:24:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:24:31 compute-0 openstack_network_exporter[205118]: ERROR   08:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:24:31 compute-0 openstack_network_exporter[205118]: ERROR   08:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:24:31 compute-0 openstack_network_exporter[205118]: ERROR   08:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:24:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:24:31 compute-0 openstack_network_exporter[205118]: ERROR   08:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:24:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:24:31 compute-0 sshd-session[221057]: Accepted publickey for nova from 192.168.122.101 port 37242 ssh2: ECDSA SHA256:nyj9easCU2+zJyxXdAvgdE/0ePVxCLkFf7X2/rv3WZg
Oct 02 08:24:31 compute-0 systemd-logind[827]: New session 39 of user nova.
Oct 02 08:24:31 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Oct 02 08:24:31 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct 02 08:24:31 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct 02 08:24:31 compute-0 systemd[1]: Starting User Manager for UID 42436...
Oct 02 08:24:31 compute-0 systemd[221061]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:24:31 compute-0 nova_compute[192567]: 2025-10-02 08:24:31.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:31 compute-0 systemd[221061]: Queued start job for default target Main User Target.
Oct 02 08:24:31 compute-0 systemd[221061]: Created slice User Application Slice.
Oct 02 08:24:31 compute-0 systemd[221061]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 02 08:24:31 compute-0 systemd[221061]: Started Daily Cleanup of User's Temporary Directories.
Oct 02 08:24:31 compute-0 systemd[221061]: Reached target Paths.
Oct 02 08:24:31 compute-0 systemd[221061]: Reached target Timers.
Oct 02 08:24:31 compute-0 systemd[221061]: Starting D-Bus User Message Bus Socket...
Oct 02 08:24:31 compute-0 systemd[221061]: Starting Create User's Volatile Files and Directories...
Oct 02 08:24:31 compute-0 systemd[221061]: Finished Create User's Volatile Files and Directories.
Oct 02 08:24:31 compute-0 systemd[221061]: Listening on D-Bus User Message Bus Socket.
Oct 02 08:24:31 compute-0 systemd[221061]: Reached target Sockets.
Oct 02 08:24:31 compute-0 systemd[221061]: Reached target Basic System.
Oct 02 08:24:31 compute-0 systemd[221061]: Reached target Main User Target.
Oct 02 08:24:31 compute-0 systemd[221061]: Startup finished in 186ms.
Oct 02 08:24:31 compute-0 systemd[1]: Started User Manager for UID 42436.
Oct 02 08:24:31 compute-0 systemd[1]: Started Session 39 of User nova.
Oct 02 08:24:31 compute-0 sshd-session[221057]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:24:31 compute-0 sshd-session[221076]: Received disconnect from 192.168.122.101 port 37242:11: disconnected by user
Oct 02 08:24:31 compute-0 sshd-session[221076]: Disconnected from user nova 192.168.122.101 port 37242
Oct 02 08:24:31 compute-0 sshd-session[221057]: pam_unix(sshd:session): session closed for user nova
Oct 02 08:24:31 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Oct 02 08:24:31 compute-0 systemd-logind[827]: Session 39 logged out. Waiting for processes to exit.
Oct 02 08:24:31 compute-0 systemd-logind[827]: Removed session 39.
Oct 02 08:24:33 compute-0 nova_compute[192567]: 2025-10-02 08:24:33.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:33 compute-0 nova_compute[192567]: 2025-10-02 08:24:33.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:24:33 compute-0 nova_compute[192567]: 2025-10-02 08:24:33.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:24:33 compute-0 nova_compute[192567]: 2025-10-02 08:24:33.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:24:33 compute-0 nova_compute[192567]: 2025-10-02 08:24:33.657 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:24:33 compute-0 nova_compute[192567]: 2025-10-02 08:24:33.657 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:24:33 compute-0 nova_compute[192567]: 2025-10-02 08:24:33.658 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:24:33 compute-0 nova_compute[192567]: 2025-10-02 08:24:33.658 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8fa7a424-b570-4b4b-a24b-843ae1bfe666 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:24:34 compute-0 podman[221078]: 2025-10-02 08:24:34.200616025 +0000 UTC m=+0.102162642 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 02 08:24:34 compute-0 podman[221081]: 2025-10-02 08:24:34.203624658 +0000 UTC m=+0.094755953 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:24:34 compute-0 podman[221080]: 2025-10-02 08:24:34.203710371 +0000 UTC m=+0.097300772 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:24:34 compute-0 podman[221079]: 2025-10-02 08:24:34.261111126 +0000 UTC m=+0.159479525 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:24:34 compute-0 nova_compute[192567]: 2025-10-02 08:24:34.341 2 DEBUG nova.compute.manager [req-b5adf6b9-75ea-45ce-96ea-0744ddcda56b req-6e875d69-9f3a-4fcb-84fb-c8131344d383 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-unplugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:24:34 compute-0 nova_compute[192567]: 2025-10-02 08:24:34.342 2 DEBUG oslo_concurrency.lockutils [req-b5adf6b9-75ea-45ce-96ea-0744ddcda56b req-6e875d69-9f3a-4fcb-84fb-c8131344d383 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:34 compute-0 nova_compute[192567]: 2025-10-02 08:24:34.342 2 DEBUG oslo_concurrency.lockutils [req-b5adf6b9-75ea-45ce-96ea-0744ddcda56b req-6e875d69-9f3a-4fcb-84fb-c8131344d383 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:34 compute-0 nova_compute[192567]: 2025-10-02 08:24:34.342 2 DEBUG oslo_concurrency.lockutils [req-b5adf6b9-75ea-45ce-96ea-0744ddcda56b req-6e875d69-9f3a-4fcb-84fb-c8131344d383 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:34 compute-0 nova_compute[192567]: 2025-10-02 08:24:34.343 2 DEBUG nova.compute.manager [req-b5adf6b9-75ea-45ce-96ea-0744ddcda56b req-6e875d69-9f3a-4fcb-84fb-c8131344d383 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] No waiting events found dispatching network-vif-unplugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:24:34 compute-0 nova_compute[192567]: 2025-10-02 08:24:34.343 2 DEBUG nova.compute.manager [req-b5adf6b9-75ea-45ce-96ea-0744ddcda56b req-6e875d69-9f3a-4fcb-84fb-c8131344d383 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-unplugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:24:35 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:35.094 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:24:35 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:35.096 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:24:35 compute-0 nova_compute[192567]: 2025-10-02 08:24:35.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:36 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:36.098 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.364 2 INFO nova.compute.manager [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Took 6.69 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.364 2 DEBUG nova.compute.manager [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.385 2 DEBUG nova.compute.manager [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkrrl14vi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8fa7a424-b570-4b4b-a24b-843ae1bfe666',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(894c372e-bf1d-4659-a110-23dab2185368),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.409 2 DEBUG nova.objects.instance [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 8fa7a424-b570-4b4b-a24b-843ae1bfe666 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.410 2 DEBUG nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.412 2 DEBUG nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.412 2 DEBUG nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.423 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Updating instance_info_cache with network_info: [{"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.429 2 DEBUG nova.compute.manager [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.429 2 DEBUG oslo_concurrency.lockutils [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.430 2 DEBUG oslo_concurrency.lockutils [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.430 2 DEBUG oslo_concurrency.lockutils [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.430 2 DEBUG nova.compute.manager [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] No waiting events found dispatching network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.431 2 WARNING nova.compute.manager [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received unexpected event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c for instance with vm_state active and task_state migrating.
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.431 2 DEBUG nova.compute.manager [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-changed-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.431 2 DEBUG nova.compute.manager [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Refreshing instance network info cache due to event network-changed-0b3c593e-59e3-49a0-b072-6632cc26ea8c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.431 2 DEBUG oslo_concurrency.lockutils [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.434 2 DEBUG nova.virt.libvirt.vif [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:23:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1316611014',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1316611014',id=16,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:23:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-535074ku',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:23:56Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=8fa7a424-b570-4b4b-a24b-843ae1bfe666,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.434 2 DEBUG nova.network.os_vif_util [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.435 2 DEBUG nova.network.os_vif_util [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b3c593e-59e3-49a0-b072-6632cc26ea8c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3c593e-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.436 2 DEBUG nova.virt.libvirt.migration [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Updating guest XML with vif config: <interface type="ethernet">
Oct 02 08:24:36 compute-0 nova_compute[192567]:   <mac address="fa:16:3e:ab:9f:fa"/>
Oct 02 08:24:36 compute-0 nova_compute[192567]:   <model type="virtio"/>
Oct 02 08:24:36 compute-0 nova_compute[192567]:   <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:24:36 compute-0 nova_compute[192567]:   <mtu size="1442"/>
Oct 02 08:24:36 compute-0 nova_compute[192567]:   <target dev="tap0b3c593e-59"/>
Oct 02 08:24:36 compute-0 nova_compute[192567]: </interface>
Oct 02 08:24:36 compute-0 nova_compute[192567]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.436 2 DEBUG nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.444 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.445 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.445 2 DEBUG oslo_concurrency.lockutils [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.445 2 DEBUG nova.network.neutron [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Refreshing network info cache for port 0b3c593e-59e3-49a0-b072-6632cc26ea8c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.446 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.475 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.476 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.477 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.477 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.556 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.649 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.651 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.726 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.914 2 DEBUG nova.virt.libvirt.migration [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:24:36 compute-0 nova_compute[192567]: 2025-10-02 08:24:36.915 2 INFO nova.virt.libvirt.migration [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.006 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.008 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5706MB free_disk=73.43631362915039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.009 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.009 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.016 2 INFO nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.084 2 INFO nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Updating resource usage from migration 894c372e-bf1d-4659-a110-23dab2185368
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.120 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Migration 894c372e-bf1d-4659-a110-23dab2185368 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.121 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.121 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.187 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.206 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.236 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.237 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.519 2 DEBUG nova.virt.libvirt.migration [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:24:37 compute-0 nova_compute[192567]: 2025-10-02 08:24:37.519 2 DEBUG nova.virt.libvirt.migration [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.029 2 DEBUG nova.virt.libvirt.migration [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.030 2 DEBUG nova.virt.libvirt.migration [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.319 2 DEBUG nova.network.neutron [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Updated VIF entry in instance network info cache for port 0b3c593e-59e3-49a0-b072-6632cc26ea8c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.320 2 DEBUG nova.network.neutron [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Updating instance_info_cache with network_info: [{"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.341 2 DEBUG oslo_concurrency.lockutils [req-5f20595f-b61d-46c5-8070-914f94d541a2 req-31f52a3d-c18f-4458-baab-4e28980388a7 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-8fa7a424-b570-4b4b-a24b-843ae1bfe666" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.415 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.416 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.416 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.417 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.447 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393478.44662, 8fa7a424-b570-4b4b-a24b-843ae1bfe666 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.447 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] VM Paused (Lifecycle Event)
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.470 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.477 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.501 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] During sync_power_state the instance has a pending task (migrating). Skip.
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.534 2 DEBUG nova.virt.libvirt.migration [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.534 2 DEBUG nova.virt.libvirt.migration [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:24:38 compute-0 kernel: tap0b3c593e-59 (unregistering): left promiscuous mode
Oct 02 08:24:38 compute-0 NetworkManager[51654]: <info>  [1759393478.6073] device (tap0b3c593e-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:24:38 compute-0 ovn_controller[94821]: 2025-10-02T08:24:38Z|00139|binding|INFO|Releasing lport 0b3c593e-59e3-49a0-b072-6632cc26ea8c from this chassis (sb_readonly=0)
Oct 02 08:24:38 compute-0 ovn_controller[94821]: 2025-10-02T08:24:38Z|00140|binding|INFO|Setting lport 0b3c593e-59e3-49a0-b072-6632cc26ea8c down in Southbound
Oct 02 08:24:38 compute-0 ovn_controller[94821]: 2025-10-02T08:24:38Z|00141|binding|INFO|Removing iface tap0b3c593e-59 ovn-installed in OVS
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:38.668 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:9f:fa 10.100.0.4'], port_security=['fa:16:3e:ab:9f:fa 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '61f597a0-da80-455c-aab0-956a1e15f143'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8fa7a424-b570-4b4b-a24b-843ae1bfe666', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=0b3c593e-59e3-49a0-b072-6632cc26ea8c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:24:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:38.671 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 0b3c593e-59e3-49a0-b072-6632cc26ea8c in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:24:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:38.672 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:24:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:38.674 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4b512a-a9dd-44ad-9357-604fe9f71e8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:24:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:38.675 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace which is not needed anymore
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:38 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000010.scope: Deactivated successfully.
Oct 02 08:24:38 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000010.scope: Consumed 13.990s CPU time.
Oct 02 08:24:38 compute-0 systemd-machined[152597]: Machine qemu-12-instance-00000010 terminated.
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.850 2 DEBUG nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.851 2 DEBUG nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Oct 02 08:24:38 compute-0 nova_compute[192567]: 2025-10-02 08:24:38.852 2 DEBUG nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Oct 02 08:24:38 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220888]: [NOTICE]   (220892) : haproxy version is 2.8.14-c23fe91
Oct 02 08:24:38 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220888]: [NOTICE]   (220892) : path to executable is /usr/sbin/haproxy
Oct 02 08:24:38 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220888]: [WARNING]  (220892) : Exiting Master process...
Oct 02 08:24:38 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220888]: [WARNING]  (220892) : Exiting Master process...
Oct 02 08:24:38 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220888]: [ALERT]    (220892) : Current worker (220894) exited with code 143 (Terminated)
Oct 02 08:24:38 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[220888]: [WARNING]  (220892) : All workers exited. Exiting... (0)
Oct 02 08:24:38 compute-0 systemd[1]: libpod-ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95.scope: Deactivated successfully.
Oct 02 08:24:38 compute-0 conmon[220888]: conmon ee7074afb8dcfe233737 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95.scope/container/memory.events
Oct 02 08:24:38 compute-0 podman[221215]: 2025-10-02 08:24:38.882488031 +0000 UTC m=+0.056565692 container died ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 02 08:24:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95-userdata-shm.mount: Deactivated successfully.
Oct 02 08:24:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-9f1fd761fe1a881cf1cf21a900caca1fe45c1795610b49de48ec1894ccf0f984-merged.mount: Deactivated successfully.
Oct 02 08:24:38 compute-0 podman[221215]: 2025-10-02 08:24:38.920034402 +0000 UTC m=+0.094112053 container cleanup ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 02 08:24:38 compute-0 systemd[1]: libpod-conmon-ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95.scope: Deactivated successfully.
Oct 02 08:24:38 compute-0 podman[221250]: 2025-10-02 08:24:38.989265784 +0000 UTC m=+0.045522029 container remove ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:24:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:38.995 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[81786167-5940-441d-abb4-5c509fcc0fa0]: (4, ('Thu Oct  2 08:24:38 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95)\nee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95\nThu Oct  2 08:24:38 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (ee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95)\nee7074afb8dcfe2337373a1d5e8d614cff1f0f0302227a6fa36fc34f0deb5a95\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:24:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:38.997 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[9c403328-74e9-43e1-8de5-1a6796a56369]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:24:38 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:38.998 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:39 compute-0 kernel: tap08b16a0c-b0: left promiscuous mode
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:39.030 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7a2690-175c-48dd-8452-22adbff549ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.038 2 DEBUG nova.virt.libvirt.guest [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '8fa7a424-b570-4b4b-a24b-843ae1bfe666' (instance-00000010) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.039 2 INFO nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Migration operation has completed
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.040 2 INFO nova.compute.manager [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] _post_live_migration() is started..
Oct 02 08:24:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:39.054 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[c2735f8e-8864-4c51-bb13-ae0b001716a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:24:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:39.056 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[cb21dfdc-b1a8-4a75-8cc0-4b4a0bc8a548]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:24:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:39.081 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[3941091f-d521-48e8-9e95-a34366678e19]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 432301, 'reachable_time': 42345, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221268, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:24:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:39.084 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:24:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:39.084 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[e303bbaf-30af-4a2e-afce-7cebd46ffb1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:24:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d08b16a0c\x2db69f\x2d4a34\x2d9bfe\x2d830099adfe8d.mount: Deactivated successfully.
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.262 2 DEBUG nova.compute.manager [req-9de8bd90-1f00-4684-88cc-45000df506b2 req-481d57ab-ae6f-4d1b-b997-c4ce0b383db6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-unplugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.263 2 DEBUG oslo_concurrency.lockutils [req-9de8bd90-1f00-4684-88cc-45000df506b2 req-481d57ab-ae6f-4d1b-b997-c4ce0b383db6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.263 2 DEBUG oslo_concurrency.lockutils [req-9de8bd90-1f00-4684-88cc-45000df506b2 req-481d57ab-ae6f-4d1b-b997-c4ce0b383db6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.264 2 DEBUG oslo_concurrency.lockutils [req-9de8bd90-1f00-4684-88cc-45000df506b2 req-481d57ab-ae6f-4d1b-b997-c4ce0b383db6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.264 2 DEBUG nova.compute.manager [req-9de8bd90-1f00-4684-88cc-45000df506b2 req-481d57ab-ae6f-4d1b-b997-c4ce0b383db6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] No waiting events found dispatching network-vif-unplugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.265 2 DEBUG nova.compute.manager [req-9de8bd90-1f00-4684-88cc-45000df506b2 req-481d57ab-ae6f-4d1b-b997-c4ce0b383db6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-unplugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:24:39 compute-0 nova_compute[192567]: 2025-10-02 08:24:39.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.181 2 DEBUG nova.network.neutron [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Activated binding for port 0b3c593e-59e3-49a0-b072-6632cc26ea8c and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.182 2 DEBUG nova.compute.manager [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.183 2 DEBUG nova.virt.libvirt.vif [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:23:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1316611014',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1316611014',id=16,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:23:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-535074ku',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:24:27Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=8fa7a424-b570-4b4b-a24b-843ae1bfe666,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.184 2 DEBUG nova.network.os_vif_util [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "address": "fa:16:3e:ab:9f:fa", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b3c593e-59", "ovs_interfaceid": "0b3c593e-59e3-49a0-b072-6632cc26ea8c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.185 2 DEBUG nova.network.os_vif_util [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b3c593e-59e3-49a0-b072-6632cc26ea8c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3c593e-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.186 2 DEBUG os_vif [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b3c593e-59e3-49a0-b072-6632cc26ea8c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3c593e-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:24:40 compute-0 podman[221269]: 2025-10-02 08:24:40.185896288 +0000 UTC m=+0.094639239 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b3c593e-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.197 2 INFO os_vif [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:9f:fa,bridge_name='br-int',has_traffic_filtering=True,id=0b3c593e-59e3-49a0-b072-6632cc26ea8c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b3c593e-59')
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.198 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.198 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.199 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.199 2 DEBUG nova.compute.manager [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.200 2 INFO nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Deleting instance files /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666_del
Oct 02 08:24:40 compute-0 nova_compute[192567]: 2025-10-02 08:24:40.201 2 INFO nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Deletion of /var/lib/nova/instances/8fa7a424-b570-4b4b-a24b-843ae1bfe666_del complete
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.396 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.397 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.397 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.398 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.398 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] No waiting events found dispatching network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.399 2 WARNING nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received unexpected event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c for instance with vm_state active and task_state migrating.
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.399 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.399 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.400 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.401 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.401 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] No waiting events found dispatching network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.402 2 WARNING nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received unexpected event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c for instance with vm_state active and task_state migrating.
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.402 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-unplugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.402 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.403 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.403 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.403 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] No waiting events found dispatching network-vif-unplugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.404 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-unplugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.404 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.405 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.405 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.405 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.406 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] No waiting events found dispatching network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.406 2 WARNING nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received unexpected event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c for instance with vm_state active and task_state migrating.
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.406 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.407 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.407 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.407 2 DEBUG oslo_concurrency.lockutils [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.408 2 DEBUG nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] No waiting events found dispatching network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.408 2 WARNING nova.compute.manager [req-3773c5d2-0858-47b9-8ed3-d35729ce1ee6 req-33338987-66b3-4509-a0f2-8fea0758ee93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Received unexpected event network-vif-plugged-0b3c593e-59e3-49a0-b072-6632cc26ea8c for instance with vm_state active and task_state migrating.
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:24:41 compute-0 nova_compute[192567]: 2025-10-02 08:24:41.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:42 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Oct 02 08:24:42 compute-0 systemd[221061]: Activating special unit Exit the Session...
Oct 02 08:24:42 compute-0 systemd[221061]: Stopped target Main User Target.
Oct 02 08:24:42 compute-0 systemd[221061]: Stopped target Basic System.
Oct 02 08:24:42 compute-0 systemd[221061]: Stopped target Paths.
Oct 02 08:24:42 compute-0 systemd[221061]: Stopped target Sockets.
Oct 02 08:24:42 compute-0 systemd[221061]: Stopped target Timers.
Oct 02 08:24:42 compute-0 systemd[221061]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 02 08:24:42 compute-0 systemd[221061]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 02 08:24:42 compute-0 systemd[221061]: Closed D-Bus User Message Bus Socket.
Oct 02 08:24:42 compute-0 systemd[221061]: Stopped Create User's Volatile Files and Directories.
Oct 02 08:24:42 compute-0 systemd[221061]: Removed slice User Application Slice.
Oct 02 08:24:42 compute-0 systemd[221061]: Reached target Shutdown.
Oct 02 08:24:42 compute-0 systemd[221061]: Finished Exit the Session.
Oct 02 08:24:42 compute-0 systemd[221061]: Reached target Exit the Session.
Oct 02 08:24:42 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Oct 02 08:24:42 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Oct 02 08:24:42 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct 02 08:24:42 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct 02 08:24:42 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct 02 08:24:42 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct 02 08:24:42 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.402 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.403 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.404 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "8fa7a424-b570-4b4b-a24b-843ae1bfe666-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.443 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.444 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.444 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.444 2 DEBUG nova.compute.resource_tracker [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.703 2 WARNING nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.705 2 DEBUG nova.compute.resource_tracker [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5893MB free_disk=73.46561431884766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.705 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.705 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.772 2 DEBUG nova.compute.resource_tracker [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Migration for instance 8fa7a424-b570-4b4b-a24b-843ae1bfe666 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.801 2 DEBUG nova.compute.resource_tracker [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.841 2 DEBUG nova.compute.resource_tracker [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Migration 894c372e-bf1d-4659-a110-23dab2185368 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.841 2 DEBUG nova.compute.resource_tracker [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.842 2 DEBUG nova.compute.resource_tracker [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.906 2 DEBUG nova.compute.provider_tree [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.927 2 DEBUG nova.scheduler.client.report [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.956 2 DEBUG nova.compute.resource_tracker [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.957 2 DEBUG oslo_concurrency.lockutils [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:45 compute-0 nova_compute[192567]: 2025-10-02 08:24:45.966 2 INFO nova.compute.manager [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 02 08:24:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:45.982 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:24:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:45.982 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:24:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:24:45.983 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:24:46 compute-0 nova_compute[192567]: 2025-10-02 08:24:46.082 2 INFO nova.scheduler.client.report [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Deleted allocation for migration 894c372e-bf1d-4659-a110-23dab2185368
Oct 02 08:24:46 compute-0 nova_compute[192567]: 2025-10-02 08:24:46.083 2 DEBUG nova.virt.libvirt.driver [None req-3255029b-4a36-4c45-a35c-c5966df9b3e8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Oct 02 08:24:46 compute-0 nova_compute[192567]: 2025-10-02 08:24:46.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:47 compute-0 nova_compute[192567]: 2025-10-02 08:24:47.416 2 DEBUG nova.compute.manager [None req-cf149d2a-38d5-4e79-bc14-cb1ac6344541 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Oct 02 08:24:47 compute-0 nova_compute[192567]: 2025-10-02 08:24:47.508 2 DEBUG nova.compute.provider_tree [None req-cf149d2a-38d5-4e79-bc14-cb1ac6344541 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 24 to 27 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:24:50 compute-0 nova_compute[192567]: 2025-10-02 08:24:50.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:51 compute-0 nova_compute[192567]: 2025-10-02 08:24:51.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:53 compute-0 podman[221297]: 2025-10-02 08:24:53.189039933 +0000 UTC m=+0.097207809 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, version=9.6, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, maintainer=Red Hat, Inc.)
Oct 02 08:24:53 compute-0 nova_compute[192567]: 2025-10-02 08:24:53.850 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393478.848231, 8fa7a424-b570-4b4b-a24b-843ae1bfe666 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:24:53 compute-0 nova_compute[192567]: 2025-10-02 08:24:53.850 2 INFO nova.compute.manager [-] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] VM Stopped (Lifecycle Event)
Oct 02 08:24:53 compute-0 nova_compute[192567]: 2025-10-02 08:24:53.882 2 DEBUG nova.compute.manager [None req-ddd35e16-e491-4f14-b06c-0af2ca8269b9 - - - - - -] [instance: 8fa7a424-b570-4b4b-a24b-843ae1bfe666] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:24:55 compute-0 nova_compute[192567]: 2025-10-02 08:24:55.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:56 compute-0 nova_compute[192567]: 2025-10-02 08:24:56.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:24:59 compute-0 podman[203011]: time="2025-10-02T08:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:24:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:24:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3006 "" "Go-http-client/1.1"
Oct 02 08:25:00 compute-0 nova_compute[192567]: 2025-10-02 08:25:00.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:01 compute-0 openstack_network_exporter[205118]: ERROR   08:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:25:01 compute-0 openstack_network_exporter[205118]: ERROR   08:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:25:01 compute-0 openstack_network_exporter[205118]: ERROR   08:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:25:01 compute-0 openstack_network_exporter[205118]: ERROR   08:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:25:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:25:01 compute-0 openstack_network_exporter[205118]: ERROR   08:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:25:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:25:01 compute-0 nova_compute[192567]: 2025-10-02 08:25:01.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:05 compute-0 nova_compute[192567]: 2025-10-02 08:25:05.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:05 compute-0 podman[221320]: 2025-10-02 08:25:05.221961889 +0000 UTC m=+0.093925957 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 02 08:25:05 compute-0 podman[221318]: 2025-10-02 08:25:05.234389783 +0000 UTC m=+0.125306338 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct 02 08:25:05 compute-0 podman[221326]: 2025-10-02 08:25:05.239022966 +0000 UTC m=+0.113971896 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:25:05 compute-0 podman[221319]: 2025-10-02 08:25:05.265598979 +0000 UTC m=+0.147315169 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:25:06 compute-0 nova_compute[192567]: 2025-10-02 08:25:06.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:10 compute-0 nova_compute[192567]: 2025-10-02 08:25:10.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:11 compute-0 podman[221397]: 2025-10-02 08:25:11.149195546 +0000 UTC m=+0.064199017 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:25:11 compute-0 nova_compute[192567]: 2025-10-02 08:25:11.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:15 compute-0 nova_compute[192567]: 2025-10-02 08:25:15.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:16 compute-0 nova_compute[192567]: 2025-10-02 08:25:16.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:20 compute-0 nova_compute[192567]: 2025-10-02 08:25:20.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:21 compute-0 nova_compute[192567]: 2025-10-02 08:25:21.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:22 compute-0 ovn_controller[94821]: 2025-10-02T08:25:22Z|00142|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Oct 02 08:25:24 compute-0 podman[221422]: 2025-10-02 08:25:24.174022171 +0000 UTC m=+0.086411795 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 02 08:25:25 compute-0 nova_compute[192567]: 2025-10-02 08:25:25.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:26 compute-0 nova_compute[192567]: 2025-10-02 08:25:26.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:29 compute-0 podman[203011]: time="2025-10-02T08:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:25:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:25:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3003 "" "Go-http-client/1.1"
Oct 02 08:25:30 compute-0 nova_compute[192567]: 2025-10-02 08:25:30.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:31 compute-0 openstack_network_exporter[205118]: ERROR   08:25:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:25:31 compute-0 openstack_network_exporter[205118]: ERROR   08:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:25:31 compute-0 openstack_network_exporter[205118]: ERROR   08:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:25:31 compute-0 openstack_network_exporter[205118]: ERROR   08:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:25:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:25:31 compute-0 openstack_network_exporter[205118]: ERROR   08:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:25:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:25:31 compute-0 nova_compute[192567]: 2025-10-02 08:25:31.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.643 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.644 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.670 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.670 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.671 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.671 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.866 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.867 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5895MB free_disk=73.46514892578125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.868 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.868 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.943 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.943 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.965 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.989 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:25:35 compute-0 nova_compute[192567]: 2025-10-02 08:25:35.990 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:25:36 compute-0 nova_compute[192567]: 2025-10-02 08:25:36.012 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:25:36 compute-0 nova_compute[192567]: 2025-10-02 08:25:36.050 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:25:36 compute-0 nova_compute[192567]: 2025-10-02 08:25:36.069 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:25:36 compute-0 nova_compute[192567]: 2025-10-02 08:25:36.089 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:25:36 compute-0 nova_compute[192567]: 2025-10-02 08:25:36.093 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:25:36 compute-0 nova_compute[192567]: 2025-10-02 08:25:36.093 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:36 compute-0 podman[221443]: 2025-10-02 08:25:36.15370915 +0000 UTC m=+0.061162354 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 08:25:36 compute-0 podman[221451]: 2025-10-02 08:25:36.185814873 +0000 UTC m=+0.081980007 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 02 08:25:36 compute-0 podman[221445]: 2025-10-02 08:25:36.186582706 +0000 UTC m=+0.091957096 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:25:36 compute-0 podman[221444]: 2025-10-02 08:25:36.206206944 +0000 UTC m=+0.119369944 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 02 08:25:36 compute-0 nova_compute[192567]: 2025-10-02 08:25:36.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:38 compute-0 nova_compute[192567]: 2025-10-02 08:25:38.074 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:25:38 compute-0 nova_compute[192567]: 2025-10-02 08:25:38.076 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:25:38 compute-0 nova_compute[192567]: 2025-10-02 08:25:38.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:25:39 compute-0 nova_compute[192567]: 2025-10-02 08:25:39.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:25:40 compute-0 nova_compute[192567]: 2025-10-02 08:25:40.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:41 compute-0 nova_compute[192567]: 2025-10-02 08:25:41.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:25:41 compute-0 nova_compute[192567]: 2025-10-02 08:25:41.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:25:41 compute-0 nova_compute[192567]: 2025-10-02 08:25:41.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:25:41 compute-0 nova_compute[192567]: 2025-10-02 08:25:41.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:42 compute-0 podman[221522]: 2025-10-02 08:25:42.162940225 +0000 UTC m=+0.077433997 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:25:43 compute-0 nova_compute[192567]: 2025-10-02 08:25:43.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.451 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.452 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.466 2 DEBUG nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.557 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.557 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.565 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.566 2 INFO nova.compute.claims [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.675 2 DEBUG nova.compute.provider_tree [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.689 2 DEBUG nova.scheduler.client.report [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.708 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.709 2 DEBUG nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.751 2 DEBUG nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.751 2 DEBUG nova.network.neutron [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.769 2 INFO nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.789 2 DEBUG nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.893 2 DEBUG nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.894 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.895 2 INFO nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Creating image(s)
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.895 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "/var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.895 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "/var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.896 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "/var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.906 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.977 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.978 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.979 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:45.983 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:45.984 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:45.984 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:45 compute-0 nova_compute[192567]: 2025-10-02 08:25:45.994 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.059 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.061 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.100 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.101 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.102 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.191 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.193 2 DEBUG nova.virt.disk.api [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Checking if we can resize image /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.193 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.252 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.254 2 DEBUG nova.virt.disk.api [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Cannot resize image /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.255 2 DEBUG nova.objects.instance [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'migration_context' on Instance uuid d5de7c96-a157-43c1-b00a-4b54c1f7bb1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.276 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.276 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Ensure instance console log exists: /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.277 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.278 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.279 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.630 2 DEBUG nova.network.neutron [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Successfully created port: 3b0883a3-0e37-423b-b7ad-46bd0fa49790 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:25:46 compute-0 nova_compute[192567]: 2025-10-02 08:25:46.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:47 compute-0 nova_compute[192567]: 2025-10-02 08:25:47.391 2 DEBUG nova.network.neutron [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Successfully updated port: 3b0883a3-0e37-423b-b7ad-46bd0fa49790 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:25:47 compute-0 nova_compute[192567]: 2025-10-02 08:25:47.405 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "refresh_cache-d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:25:47 compute-0 nova_compute[192567]: 2025-10-02 08:25:47.405 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquired lock "refresh_cache-d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:25:47 compute-0 nova_compute[192567]: 2025-10-02 08:25:47.406 2 DEBUG nova.network.neutron [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:25:47 compute-0 nova_compute[192567]: 2025-10-02 08:25:47.462 2 DEBUG nova.compute.manager [req-def9202a-7d0b-494b-8d8c-a729b56a5ac2 req-2915ee91-3f44-46e0-975f-9d3f9ba8070c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Received event network-changed-3b0883a3-0e37-423b-b7ad-46bd0fa49790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:25:47 compute-0 nova_compute[192567]: 2025-10-02 08:25:47.463 2 DEBUG nova.compute.manager [req-def9202a-7d0b-494b-8d8c-a729b56a5ac2 req-2915ee91-3f44-46e0-975f-9d3f9ba8070c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Refreshing instance network info cache due to event network-changed-3b0883a3-0e37-423b-b7ad-46bd0fa49790. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:25:47 compute-0 nova_compute[192567]: 2025-10-02 08:25:47.463 2 DEBUG oslo_concurrency.lockutils [req-def9202a-7d0b-494b-8d8c-a729b56a5ac2 req-2915ee91-3f44-46e0-975f-9d3f9ba8070c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:25:47 compute-0 nova_compute[192567]: 2025-10-02 08:25:47.602 2 DEBUG nova.network.neutron [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:25:48 compute-0 unix_chkpwd[221565]: password check failed for user (root)
Oct 02 08:25:48 compute-0 sshd-session[221563]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.330 2 DEBUG nova.network.neutron [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Updating instance_info_cache with network_info: [{"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.360 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Releasing lock "refresh_cache-d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.360 2 DEBUG nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Instance network_info: |[{"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.361 2 DEBUG oslo_concurrency.lockutils [req-def9202a-7d0b-494b-8d8c-a729b56a5ac2 req-2915ee91-3f44-46e0-975f-9d3f9ba8070c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.361 2 DEBUG nova.network.neutron [req-def9202a-7d0b-494b-8d8c-a729b56a5ac2 req-2915ee91-3f44-46e0-975f-9d3f9ba8070c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Refreshing network info cache for port 3b0883a3-0e37-423b-b7ad-46bd0fa49790 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.366 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Start _get_guest_xml network_info=[{"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.374 2 WARNING nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.381 2 DEBUG nova.virt.libvirt.host [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.382 2 DEBUG nova.virt.libvirt.host [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.390 2 DEBUG nova.virt.libvirt.host [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.390 2 DEBUG nova.virt.libvirt.host [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.391 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.392 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.392 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.393 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.393 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.394 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.394 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.395 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.395 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.396 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.396 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.397 2 DEBUG nova.virt.hardware [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.402 2 DEBUG nova.virt.libvirt.vif [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1435343638',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1435343638',id=18,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-ggy7uluk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=TagList,task
_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:25:45Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=d5de7c96-a157-43c1-b00a-4b54c1f7bb1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.403 2 DEBUG nova.network.os_vif_util [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.404 2 DEBUG nova.network.os_vif_util [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:61:15,bridge_name='br-int',has_traffic_filtering=True,id=3b0883a3-0e37-423b-b7ad-46bd0fa49790,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0883a3-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.406 2 DEBUG nova.objects.instance [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'pci_devices' on Instance uuid d5de7c96-a157-43c1-b00a-4b54c1f7bb1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.428 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <uuid>d5de7c96-a157-43c1-b00a-4b54c1f7bb1c</uuid>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <name>instance-00000012</name>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteStrategies-server-1435343638</nova:name>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:25:48</nova:creationTime>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:25:48 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:25:48 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:25:48 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:25:48 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:25:48 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:25:48 compute-0 nova_compute[192567]:         <nova:user uuid="bf38fbc8dd7b4c4db6c469a7951b0942">tempest-TestExecuteStrategies-1382092507-project-admin</nova:user>
Oct 02 08:25:48 compute-0 nova_compute[192567]:         <nova:project uuid="1ea832b474574009921dff909e4daeaf">tempest-TestExecuteStrategies-1382092507</nova:project>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:25:48 compute-0 nova_compute[192567]:         <nova:port uuid="3b0883a3-0e37-423b-b7ad-46bd0fa49790">
Oct 02 08:25:48 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <system>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <entry name="serial">d5de7c96-a157-43c1-b00a-4b54c1f7bb1c</entry>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <entry name="uuid">d5de7c96-a157-43c1-b00a-4b54c1f7bb1c</entry>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     </system>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <os>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   </os>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <features>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   </features>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk.config"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:40:61:15"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <target dev="tap3b0883a3-0e"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/console.log" append="off"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <video>
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     </video>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:25:48 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:25:48 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:25:48 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:25:48 compute-0 nova_compute[192567]: </domain>
Oct 02 08:25:48 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.429 2 DEBUG nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Preparing to wait for external event network-vif-plugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.431 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.431 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.432 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.434 2 DEBUG nova.virt.libvirt.vif [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1435343638',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1435343638',id=18,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-ggy7uluk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:25:45Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=d5de7c96-a157-43c1-b00a-4b54c1f7bb1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.434 2 DEBUG nova.network.os_vif_util [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.436 2 DEBUG nova.network.os_vif_util [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:61:15,bridge_name='br-int',has_traffic_filtering=True,id=3b0883a3-0e37-423b-b7ad-46bd0fa49790,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0883a3-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.437 2 DEBUG os_vif [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:61:15,bridge_name='br-int',has_traffic_filtering=True,id=3b0883a3-0e37-423b-b7ad-46bd0fa49790,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0883a3-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.439 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.439 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.448 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b0883a3-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.449 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b0883a3-0e, col_values=(('external_ids', {'iface-id': '3b0883a3-0e37-423b-b7ad-46bd0fa49790', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:61:15', 'vm-uuid': 'd5de7c96-a157-43c1-b00a-4b54c1f7bb1c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:48 compute-0 NetworkManager[51654]: <info>  [1759393548.4544] manager: (tap3b0883a3-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.464 2 INFO os_vif [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:61:15,bridge_name='br-int',has_traffic_filtering=True,id=3b0883a3-0e37-423b-b7ad-46bd0fa49790,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0883a3-0e')
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.535 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.535 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.536 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No VIF found with MAC fa:16:3e:40:61:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.536 2 INFO nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Using config drive
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.910 2 INFO nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Creating config drive at /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk.config
Oct 02 08:25:48 compute-0 nova_compute[192567]: 2025-10-02 08:25:48.918 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf4g2v_sd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.055 2 DEBUG oslo_concurrency.processutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf4g2v_sd" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:25:49 compute-0 kernel: tap3b0883a3-0e: entered promiscuous mode
Oct 02 08:25:49 compute-0 ovn_controller[94821]: 2025-10-02T08:25:49Z|00143|binding|INFO|Claiming lport 3b0883a3-0e37-423b-b7ad-46bd0fa49790 for this chassis.
Oct 02 08:25:49 compute-0 ovn_controller[94821]: 2025-10-02T08:25:49Z|00144|binding|INFO|3b0883a3-0e37-423b-b7ad-46bd0fa49790: Claiming fa:16:3e:40:61:15 10.100.0.3
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:49 compute-0 NetworkManager[51654]: <info>  [1759393549.1607] manager: (tap3b0883a3-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.170 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:61:15 10.100.0.3'], port_security=['fa:16:3e:40:61:15 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd5de7c96-a157-43c1-b00a-4b54c1f7bb1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=3b0883a3-0e37-423b-b7ad-46bd0fa49790) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:25:49 compute-0 ovn_controller[94821]: 2025-10-02T08:25:49Z|00145|binding|INFO|Setting lport 3b0883a3-0e37-423b-b7ad-46bd0fa49790 ovn-installed in OVS
Oct 02 08:25:49 compute-0 ovn_controller[94821]: 2025-10-02T08:25:49Z|00146|binding|INFO|Setting lport 3b0883a3-0e37-423b-b7ad-46bd0fa49790 up in Southbound
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.172 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 3b0883a3-0e37-423b-b7ad-46bd0fa49790 in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d bound to our chassis
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.174 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.193 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[aaef6b43-c69b-468c-a3c2-5e8f9bc70481]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.194 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08b16a0c-b1 in ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.197 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08b16a0c-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.197 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[276e9d14-1c0f-48e8-ab7f-f87bc49359d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.197 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[fc0a5f6e-2e8f-4892-847a-06fce3a0ebbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 systemd-udevd[221587]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:25:49 compute-0 systemd-machined[152597]: New machine qemu-13-instance-00000012.
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.220 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[52677131-2cab-4197-8c9e-b7493d6eb971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 NetworkManager[51654]: <info>  [1759393549.2308] device (tap3b0883a3-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:25:49 compute-0 NetworkManager[51654]: <info>  [1759393549.2332] device (tap3b0883a3-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:25:49 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000012.
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.254 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e8eab5dc-e75e-4d8c-ad1f-18be5e41d545]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.299 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[10c6b46e-4e21-4924-a473-eb71433e5d0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.308 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[1ddfda86-79e6-45ed-86b5-6d5eb0af3680]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 NetworkManager[51654]: <info>  [1759393549.3099] manager: (tap08b16a0c-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.364 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[24f40571-de03-4136-ad7c-ec1a22a4b126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.367 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[791e5188-c32f-4ce6-9ebc-ad1b4c0949e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 NetworkManager[51654]: <info>  [1759393549.3957] device (tap08b16a0c-b0): carrier: link connected
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.404 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[08ad4a8e-b167-4594-9795-0388b3ce1c42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.424 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8844713e-311b-48e3-abcd-285999ded24c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443698, 'reachable_time': 36727, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221618, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.448 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8807e7e0-dc2d-4807-8165-4a1a2aa90584]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:c53f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443698, 'tstamp': 443698}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221619, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.472 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fcebbc-0cfe-4691-92f1-63e10e594501]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443698, 'reachable_time': 36727, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221620, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.516 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b958105f-522f-4770-b3b1-03a016385f1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.551 2 DEBUG nova.compute.manager [req-946d1ace-0821-47ba-b0e8-3c70c562c845 req-73ac9454-6ec1-469d-87d2-3e2b6f9ac103 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Received event network-vif-plugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.552 2 DEBUG oslo_concurrency.lockutils [req-946d1ace-0821-47ba-b0e8-3c70c562c845 req-73ac9454-6ec1-469d-87d2-3e2b6f9ac103 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.553 2 DEBUG oslo_concurrency.lockutils [req-946d1ace-0821-47ba-b0e8-3c70c562c845 req-73ac9454-6ec1-469d-87d2-3e2b6f9ac103 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.553 2 DEBUG oslo_concurrency.lockutils [req-946d1ace-0821-47ba-b0e8-3c70c562c845 req-73ac9454-6ec1-469d-87d2-3e2b6f9ac103 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.553 2 DEBUG nova.compute.manager [req-946d1ace-0821-47ba-b0e8-3c70c562c845 req-73ac9454-6ec1-469d-87d2-3e2b6f9ac103 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Processing event network-vif-plugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.574 2 DEBUG nova.network.neutron [req-def9202a-7d0b-494b-8d8c-a729b56a5ac2 req-2915ee91-3f44-46e0-975f-9d3f9ba8070c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Updated VIF entry in instance network info cache for port 3b0883a3-0e37-423b-b7ad-46bd0fa49790. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.575 2 DEBUG nova.network.neutron [req-def9202a-7d0b-494b-8d8c-a729b56a5ac2 req-2915ee91-3f44-46e0-975f-9d3f9ba8070c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Updating instance_info_cache with network_info: [{"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.588 2 DEBUG oslo_concurrency.lockutils [req-def9202a-7d0b-494b-8d8c-a729b56a5ac2 req-2915ee91-3f44-46e0-975f-9d3f9ba8070c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.597 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8f606375-bff8-4b53-9835-b4dfcecc84f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.600 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.600 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.601 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:25:49 compute-0 NetworkManager[51654]: <info>  [1759393549.6034] manager: (tap08b16a0c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:49 compute-0 kernel: tap08b16a0c-b0: entered promiscuous mode
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.605 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.607 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:49 compute-0 ovn_controller[94821]: 2025-10-02T08:25:49Z|00147|binding|INFO|Releasing lport 748eef31-77a8-4b04-b6b7-dc0f7cc1cf65 from this chassis (sb_readonly=0)
Oct 02 08:25:49 compute-0 nova_compute[192567]: 2025-10-02 08:25:49.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.628 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.629 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec93d34-335c-4298-b72d-b2b17334c5fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.630 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:25:49 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:49.631 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'env', 'PROCESS_TAG=haproxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08b16a0c-b69f-4a34-9bfe-830099adfe8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.008 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393550.0075967, d5de7c96-a157-43c1-b00a-4b54c1f7bb1c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.008 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] VM Started (Lifecycle Event)
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.010 2 DEBUG nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.014 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.017 2 INFO nova.virt.libvirt.driver [-] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Instance spawned successfully.
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.017 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.033 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.038 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.042 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.043 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.043 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.043 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.044 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.044 2 DEBUG nova.virt.libvirt.driver [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.076 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.077 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393550.0086825, d5de7c96-a157-43c1-b00a-4b54c1f7bb1c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.077 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] VM Paused (Lifecycle Event)
Oct 02 08:25:50 compute-0 podman[221657]: 2025-10-02 08:25:50.086268993 +0000 UTC m=+0.081170190 container create 0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.102 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.108 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393550.0129566, d5de7c96-a157-43c1-b00a-4b54c1f7bb1c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.108 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] VM Resumed (Lifecycle Event)
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.111 2 INFO nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Took 4.22 seconds to spawn the instance on the hypervisor.
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.112 2 DEBUG nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.124 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.127 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:25:50 compute-0 systemd[1]: Started libpod-conmon-0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91.scope.
Oct 02 08:25:50 compute-0 podman[221657]: 2025-10-02 08:25:50.045597701 +0000 UTC m=+0.040498918 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.151 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:25:50 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:25:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c7a70d6e41bd3f93a0b4b67a855057690728e39903bd66c4c1c54ac8f86d902/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.175 2 INFO nova.compute.manager [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Took 4.65 seconds to build instance.
Oct 02 08:25:50 compute-0 podman[221657]: 2025-10-02 08:25:50.192207946 +0000 UTC m=+0.187109183 container init 0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct 02 08:25:50 compute-0 nova_compute[192567]: 2025-10-02 08:25:50.194 2 DEBUG oslo_concurrency.lockutils [None req-d5608bc7-dc75-4e7b-9fdc-36a8b1029d9b bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:50 compute-0 podman[221657]: 2025-10-02 08:25:50.197729257 +0000 UTC m=+0.192630474 container start 0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:25:50 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[221672]: [NOTICE]   (221677) : New worker (221679) forked
Oct 02 08:25:50 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[221672]: [NOTICE]   (221677) : Loading success.
Oct 02 08:25:50 compute-0 sshd-session[221563]: Failed password for root from 193.46.255.217 port 44104 ssh2
Oct 02 08:25:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:50.262 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:25:51 compute-0 nova_compute[192567]: 2025-10-02 08:25:51.632 2 DEBUG nova.compute.manager [req-7d4051cc-1b68-45d0-8ded-bc052bdea765 req-62b47c23-7eb1-4d19-9b3d-9eb47784a980 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Received event network-vif-plugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:25:51 compute-0 nova_compute[192567]: 2025-10-02 08:25:51.632 2 DEBUG oslo_concurrency.lockutils [req-7d4051cc-1b68-45d0-8ded-bc052bdea765 req-62b47c23-7eb1-4d19-9b3d-9eb47784a980 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:25:51 compute-0 nova_compute[192567]: 2025-10-02 08:25:51.632 2 DEBUG oslo_concurrency.lockutils [req-7d4051cc-1b68-45d0-8ded-bc052bdea765 req-62b47c23-7eb1-4d19-9b3d-9eb47784a980 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:25:51 compute-0 nova_compute[192567]: 2025-10-02 08:25:51.633 2 DEBUG oslo_concurrency.lockutils [req-7d4051cc-1b68-45d0-8ded-bc052bdea765 req-62b47c23-7eb1-4d19-9b3d-9eb47784a980 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:25:51 compute-0 nova_compute[192567]: 2025-10-02 08:25:51.633 2 DEBUG nova.compute.manager [req-7d4051cc-1b68-45d0-8ded-bc052bdea765 req-62b47c23-7eb1-4d19-9b3d-9eb47784a980 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] No waiting events found dispatching network-vif-plugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:25:51 compute-0 nova_compute[192567]: 2025-10-02 08:25:51.633 2 WARNING nova.compute.manager [req-7d4051cc-1b68-45d0-8ded-bc052bdea765 req-62b47c23-7eb1-4d19-9b3d-9eb47784a980 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Received unexpected event network-vif-plugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 for instance with vm_state active and task_state None.
Oct 02 08:25:51 compute-0 nova_compute[192567]: 2025-10-02 08:25:51.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:52 compute-0 unix_chkpwd[221688]: password check failed for user (root)
Oct 02 08:25:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:25:53.271 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:25:53 compute-0 nova_compute[192567]: 2025-10-02 08:25:53.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:53 compute-0 sshd-session[221563]: Failed password for root from 193.46.255.217 port 44104 ssh2
Oct 02 08:25:54 compute-0 unix_chkpwd[221689]: password check failed for user (root)
Oct 02 08:25:55 compute-0 podman[221690]: 2025-10-02 08:25:55.205635887 +0000 UTC m=+0.109825294 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:25:56 compute-0 sshd-session[221563]: Failed password for root from 193.46.255.217 port 44104 ssh2
Oct 02 08:25:56 compute-0 nova_compute[192567]: 2025-10-02 08:25:56.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:58 compute-0 nova_compute[192567]: 2025-10-02 08:25:58.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:25:58 compute-0 sshd-session[221563]: Received disconnect from 193.46.255.217 port 44104:11:  [preauth]
Oct 02 08:25:58 compute-0 sshd-session[221563]: Disconnected from authenticating user root 193.46.255.217 port 44104 [preauth]
Oct 02 08:25:58 compute-0 sshd-session[221563]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:25:59 compute-0 unix_chkpwd[221711]: password check failed for user (root)
Oct 02 08:25:59 compute-0 sshd-session[221709]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:25:59 compute-0 podman[203011]: time="2025-10-02T08:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:25:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:25:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3461 "" "Go-http-client/1.1"
Oct 02 08:26:01 compute-0 ovn_controller[94821]: 2025-10-02T08:26:01Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:40:61:15 10.100.0.3
Oct 02 08:26:01 compute-0 ovn_controller[94821]: 2025-10-02T08:26:01Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:61:15 10.100.0.3
Oct 02 08:26:01 compute-0 openstack_network_exporter[205118]: ERROR   08:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:26:01 compute-0 openstack_network_exporter[205118]: ERROR   08:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:26:01 compute-0 openstack_network_exporter[205118]: ERROR   08:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:26:01 compute-0 openstack_network_exporter[205118]: ERROR   08:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:26:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:26:01 compute-0 openstack_network_exporter[205118]: ERROR   08:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:26:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:26:01 compute-0 sshd-session[221709]: Failed password for root from 193.46.255.217 port 63572 ssh2
Oct 02 08:26:01 compute-0 nova_compute[192567]: 2025-10-02 08:26:01.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:03 compute-0 nova_compute[192567]: 2025-10-02 08:26:03.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:03 compute-0 unix_chkpwd[221721]: password check failed for user (root)
Oct 02 08:26:06 compute-0 sshd-session[221709]: Failed password for root from 193.46.255.217 port 63572 ssh2
Oct 02 08:26:07 compute-0 nova_compute[192567]: 2025-10-02 08:26:07.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:07 compute-0 podman[221724]: 2025-10-02 08:26:07.206677142 +0000 UTC m=+0.095611836 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:26:07 compute-0 podman[221722]: 2025-10-02 08:26:07.222567321 +0000 UTC m=+0.122613727 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 02 08:26:07 compute-0 podman[221725]: 2025-10-02 08:26:07.22767904 +0000 UTC m=+0.116657284 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 02 08:26:07 compute-0 podman[221723]: 2025-10-02 08:26:07.24686707 +0000 UTC m=+0.141859790 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct 02 08:26:07 compute-0 unix_chkpwd[221806]: password check failed for user (root)
Oct 02 08:26:08 compute-0 nova_compute[192567]: 2025-10-02 08:26:08.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:10 compute-0 sshd-session[221709]: Failed password for root from 193.46.255.217 port 63572 ssh2
Oct 02 08:26:11 compute-0 sshd-session[221709]: Received disconnect from 193.46.255.217 port 63572:11:  [preauth]
Oct 02 08:26:11 compute-0 sshd-session[221709]: Disconnected from authenticating user root 193.46.255.217 port 63572 [preauth]
Oct 02 08:26:11 compute-0 sshd-session[221709]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:26:12 compute-0 nova_compute[192567]: 2025-10-02 08:26:12.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:12 compute-0 unix_chkpwd[221809]: password check failed for user (root)
Oct 02 08:26:12 compute-0 sshd-session[221807]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:26:13 compute-0 podman[221810]: 2025-10-02 08:26:13.193401989 +0000 UTC m=+0.103684125 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:26:13 compute-0 nova_compute[192567]: 2025-10-02 08:26:13.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:14 compute-0 sshd-session[221807]: Failed password for root from 193.46.255.217 port 16552 ssh2
Oct 02 08:26:14 compute-0 unix_chkpwd[221834]: password check failed for user (root)
Oct 02 08:26:17 compute-0 nova_compute[192567]: 2025-10-02 08:26:17.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:17 compute-0 sshd-session[221807]: Failed password for root from 193.46.255.217 port 16552 ssh2
Oct 02 08:26:18 compute-0 nova_compute[192567]: 2025-10-02 08:26:18.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:19 compute-0 ovn_controller[94821]: 2025-10-02T08:26:19Z|00148|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Oct 02 08:26:19 compute-0 unix_chkpwd[221835]: password check failed for user (root)
Oct 02 08:26:21 compute-0 sshd-session[221807]: Failed password for root from 193.46.255.217 port 16552 ssh2
Oct 02 08:26:22 compute-0 nova_compute[192567]: 2025-10-02 08:26:22.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:23 compute-0 sshd-session[221807]: Received disconnect from 193.46.255.217 port 16552:11:  [preauth]
Oct 02 08:26:23 compute-0 sshd-session[221807]: Disconnected from authenticating user root 193.46.255.217 port 16552 [preauth]
Oct 02 08:26:23 compute-0 sshd-session[221807]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:26:23 compute-0 nova_compute[192567]: 2025-10-02 08:26:23.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:26 compute-0 podman[221837]: 2025-10-02 08:26:26.212618213 +0000 UTC m=+0.117907473 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, vcs-type=git, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Oct 02 08:26:27 compute-0 nova_compute[192567]: 2025-10-02 08:26:27.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:28 compute-0 nova_compute[192567]: 2025-10-02 08:26:28.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:29 compute-0 podman[203011]: time="2025-10-02T08:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:26:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:26:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3470 "" "Go-http-client/1.1"
Oct 02 08:26:31 compute-0 openstack_network_exporter[205118]: ERROR   08:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:26:31 compute-0 openstack_network_exporter[205118]: ERROR   08:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:26:31 compute-0 openstack_network_exporter[205118]: ERROR   08:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:26:31 compute-0 openstack_network_exporter[205118]: ERROR   08:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:26:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:26:31 compute-0 openstack_network_exporter[205118]: ERROR   08:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:26:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:26:32 compute-0 nova_compute[192567]: 2025-10-02 08:26:32.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:33 compute-0 nova_compute[192567]: 2025-10-02 08:26:33.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:34 compute-0 nova_compute[192567]: 2025-10-02 08:26:34.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:34 compute-0 nova_compute[192567]: 2025-10-02 08:26:34.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:26:34 compute-0 nova_compute[192567]: 2025-10-02 08:26:34.639 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:26:34 compute-0 nova_compute[192567]: 2025-10-02 08:26:34.639 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:34 compute-0 nova_compute[192567]: 2025-10-02 08:26:34.639 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:26:35 compute-0 nova_compute[192567]: 2025-10-02 08:26:35.659 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:35 compute-0 nova_compute[192567]: 2025-10-02 08:26:35.660 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:26:35 compute-0 nova_compute[192567]: 2025-10-02 08:26:35.661 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:26:36 compute-0 nova_compute[192567]: 2025-10-02 08:26:36.123 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:26:36 compute-0 nova_compute[192567]: 2025-10-02 08:26:36.123 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:26:36 compute-0 nova_compute[192567]: 2025-10-02 08:26:36.124 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:26:36 compute-0 nova_compute[192567]: 2025-10-02 08:26:36.124 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d5de7c96-a157-43c1-b00a-4b54c1f7bb1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:26:37 compute-0 nova_compute[192567]: 2025-10-02 08:26:37.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:37 compute-0 nova_compute[192567]: 2025-10-02 08:26:37.748 2 DEBUG nova.virt.libvirt.driver [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Creating tmpfile /var/lib/nova/instances/tmpx6l99gwb to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:26:37 compute-0 nova_compute[192567]: 2025-10-02 08:26:37.863 2 DEBUG nova.compute.manager [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx6l99gwb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.078 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Updating instance_info_cache with network_info: [{"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.093 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.094 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.094 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.125 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.126 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.126 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.126 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:26:38 compute-0 podman[221860]: 2025-10-02 08:26:38.203132562 +0000 UTC m=+0.107571845 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.213 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:26:38 compute-0 podman[221861]: 2025-10-02 08:26:38.244476584 +0000 UTC m=+0.121258675 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 02 08:26:38 compute-0 podman[221858]: 2025-10-02 08:26:38.245755084 +0000 UTC m=+0.153647213 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Oct 02 08:26:38 compute-0 podman[221859]: 2025-10-02 08:26:38.263535271 +0000 UTC m=+0.167496889 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.292 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.294 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.362 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.613 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.614 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5679MB free_disk=73.43583297729492GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.615 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.615 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.674 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Migration for instance f7e8d3db-fabd-4d87-9569-a348f1b9788c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.692 2 INFO nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Updating resource usage from migration eee64733-296f-41d7-a7dc-85c5fc5f5df6
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.693 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Starting to track incoming migration eee64733-296f-41d7-a7dc-85c5fc5f5df6 with flavor 932d352e-81e8-4137-94d3-19616d5c2ae2 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.730 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance d5de7c96-a157-43c1-b00a-4b54c1f7bb1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.750 2 WARNING nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance f7e8d3db-fabd-4d87-9569-a348f1b9788c has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.750 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.750 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.791 2 DEBUG nova.compute.manager [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx6l99gwb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f7e8d3db-fabd-4d87-9569-a348f1b9788c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.822 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-f7e8d3db-fabd-4d87-9569-a348f1b9788c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.822 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-f7e8d3db-fabd-4d87-9569-a348f1b9788c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.823 2 DEBUG nova.network.neutron [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.853 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.868 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.884 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:26:38 compute-0 nova_compute[192567]: 2025-10-02 08:26:38.884 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.170 2 DEBUG nova.network.neutron [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Updating instance_info_cache with network_info: [{"id": "e27cfaea-ded9-45c4-87af-3213584cf98e", "address": "fa:16:3e:56:41:a6", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape27cfaea-de", "ovs_interfaceid": "e27cfaea-ded9-45c4-87af-3213584cf98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.192 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-f7e8d3db-fabd-4d87-9569-a348f1b9788c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.195 2 DEBUG nova.virt.libvirt.driver [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx6l99gwb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f7e8d3db-fabd-4d87-9569-a348f1b9788c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.196 2 DEBUG nova.virt.libvirt.driver [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Creating instance directory: /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.196 2 DEBUG nova.virt.libvirt.driver [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Creating disk.info with the contents: {'/var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk': 'qcow2', '/var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.197 2 DEBUG nova.virt.libvirt.driver [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.198 2 DEBUG nova.objects.instance [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f7e8d3db-fabd-4d87-9569-a348f1b9788c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.237 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.323 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.326 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.327 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.355 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.414 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.415 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.415 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.416 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.440 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.442 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.474 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.475 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.476 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.533 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.535 2 DEBUG nova.virt.disk.api [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.536 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.586 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.588 2 DEBUG nova.virt.disk.api [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.589 2 DEBUG nova.objects.instance [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid f7e8d3db-fabd-4d87-9569-a348f1b9788c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.614 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.641 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk.config 485376" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.644 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk.config to /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:26:40 compute-0 nova_compute[192567]: 2025-10-02 08:26:40.644 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk.config /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.170 2 DEBUG oslo_concurrency.processutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c/disk.config /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.172 2 DEBUG nova.virt.libvirt.driver [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.174 2 DEBUG nova.virt.libvirt.vif [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:25:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1662760619',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1662760619',id=17,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:25:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-lcbhr566',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:25:38Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=f7e8d3db-fabd-4d87-9569-a348f1b9788c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e27cfaea-ded9-45c4-87af-3213584cf98e", "address": "fa:16:3e:56:41:a6", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape27cfaea-de", "ovs_interfaceid": "e27cfaea-ded9-45c4-87af-3213584cf98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.175 2 DEBUG nova.network.os_vif_util [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "e27cfaea-ded9-45c4-87af-3213584cf98e", "address": "fa:16:3e:56:41:a6", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape27cfaea-de", "ovs_interfaceid": "e27cfaea-ded9-45c4-87af-3213584cf98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.177 2 DEBUG nova.network.os_vif_util [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:56:41:a6,bridge_name='br-int',has_traffic_filtering=True,id=e27cfaea-ded9-45c4-87af-3213584cf98e,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape27cfaea-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.178 2 DEBUG os_vif [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:41:a6,bridge_name='br-int',has_traffic_filtering=True,id=e27cfaea-ded9-45c4-87af-3213584cf98e,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape27cfaea-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.180 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.186 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape27cfaea-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.187 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape27cfaea-de, col_values=(('external_ids', {'iface-id': 'e27cfaea-ded9-45c4-87af-3213584cf98e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:41:a6', 'vm-uuid': 'f7e8d3db-fabd-4d87-9569-a348f1b9788c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:41 compute-0 NetworkManager[51654]: <info>  [1759393601.1916] manager: (tape27cfaea-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.201 2 INFO os_vif [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:41:a6,bridge_name='br-int',has_traffic_filtering=True,id=e27cfaea-ded9-45c4-87af-3213584cf98e,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape27cfaea-de')
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.201 2 DEBUG nova.virt.libvirt.driver [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:26:41 compute-0 nova_compute[192567]: 2025-10-02 08:26:41.202 2 DEBUG nova.compute.manager [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx6l99gwb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f7e8d3db-fabd-4d87-9569-a348f1b9788c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:26:42 compute-0 nova_compute[192567]: 2025-10-02 08:26:42.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:42 compute-0 nova_compute[192567]: 2025-10-02 08:26:42.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:43 compute-0 nova_compute[192567]: 2025-10-02 08:26:43.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:43 compute-0 nova_compute[192567]: 2025-10-02 08:26:43.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:26:44 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:44.109 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:26:44 compute-0 nova_compute[192567]: 2025-10-02 08:26:44.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:44 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:44.112 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:26:44 compute-0 podman[221963]: 2025-10-02 08:26:44.194440599 +0000 UTC m=+0.096405779 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:26:45 compute-0 nova_compute[192567]: 2025-10-02 08:26:45.176 2 DEBUG nova.network.neutron [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Port e27cfaea-ded9-45c4-87af-3213584cf98e updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:26:45 compute-0 nova_compute[192567]: 2025-10-02 08:26:45.178 2 DEBUG nova.compute.manager [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx6l99gwb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f7e8d3db-fabd-4d87-9569-a348f1b9788c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:26:45 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 08:26:45 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 08:26:45 compute-0 NetworkManager[51654]: <info>  [1759393605.5690] manager: (tape27cfaea-de): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Oct 02 08:26:45 compute-0 kernel: tape27cfaea-de: entered promiscuous mode
Oct 02 08:26:45 compute-0 nova_compute[192567]: 2025-10-02 08:26:45.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:45 compute-0 ovn_controller[94821]: 2025-10-02T08:26:45Z|00149|binding|INFO|Claiming lport e27cfaea-ded9-45c4-87af-3213584cf98e for this additional chassis.
Oct 02 08:26:45 compute-0 ovn_controller[94821]: 2025-10-02T08:26:45Z|00150|binding|INFO|e27cfaea-ded9-45c4-87af-3213584cf98e: Claiming fa:16:3e:56:41:a6 10.100.0.11
Oct 02 08:26:45 compute-0 ovn_controller[94821]: 2025-10-02T08:26:45Z|00151|binding|INFO|Setting lport e27cfaea-ded9-45c4-87af-3213584cf98e ovn-installed in OVS
Oct 02 08:26:45 compute-0 nova_compute[192567]: 2025-10-02 08:26:45.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:45 compute-0 nova_compute[192567]: 2025-10-02 08:26:45.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:45 compute-0 systemd-udevd[222018]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:26:45 compute-0 systemd-machined[152597]: New machine qemu-14-instance-00000011.
Oct 02 08:26:45 compute-0 NetworkManager[51654]: <info>  [1759393605.6323] device (tape27cfaea-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:26:45 compute-0 NetworkManager[51654]: <info>  [1759393605.6339] device (tape27cfaea-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:26:45 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000011.
Oct 02 08:26:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:45.985 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:26:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:45.988 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:26:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:45.989 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:26:46 compute-0 nova_compute[192567]: 2025-10-02 08:26:46.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:46 compute-0 nova_compute[192567]: 2025-10-02 08:26:46.919 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393606.9190419, f7e8d3db-fabd-4d87-9569-a348f1b9788c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:26:46 compute-0 nova_compute[192567]: 2025-10-02 08:26:46.920 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] VM Started (Lifecycle Event)
Oct 02 08:26:46 compute-0 nova_compute[192567]: 2025-10-02 08:26:46.945 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:26:47 compute-0 nova_compute[192567]: 2025-10-02 08:26:47.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:47 compute-0 nova_compute[192567]: 2025-10-02 08:26:47.691 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393607.6909163, f7e8d3db-fabd-4d87-9569-a348f1b9788c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:26:47 compute-0 nova_compute[192567]: 2025-10-02 08:26:47.692 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] VM Resumed (Lifecycle Event)
Oct 02 08:26:47 compute-0 nova_compute[192567]: 2025-10-02 08:26:47.712 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:26:47 compute-0 nova_compute[192567]: 2025-10-02 08:26:47.716 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:26:47 compute-0 nova_compute[192567]: 2025-10-02 08:26:47.739 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:26:48 compute-0 nova_compute[192567]: 2025-10-02 08:26:48.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:48 compute-0 ovn_controller[94821]: 2025-10-02T08:26:48Z|00152|binding|INFO|Claiming lport e27cfaea-ded9-45c4-87af-3213584cf98e for this chassis.
Oct 02 08:26:48 compute-0 ovn_controller[94821]: 2025-10-02T08:26:48Z|00153|binding|INFO|e27cfaea-ded9-45c4-87af-3213584cf98e: Claiming fa:16:3e:56:41:a6 10.100.0.11
Oct 02 08:26:48 compute-0 ovn_controller[94821]: 2025-10-02T08:26:48Z|00154|binding|INFO|Setting lport e27cfaea-ded9-45c4-87af-3213584cf98e up in Southbound
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.756 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:41:a6 10.100.0.11'], port_security=['fa:16:3e:56:41:a6 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f7e8d3db-fabd-4d87-9569-a348f1b9788c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=e27cfaea-ded9-45c4-87af-3213584cf98e) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.758 103703 INFO neutron.agent.ovn.metadata.agent [-] Port e27cfaea-ded9-45c4-87af-3213584cf98e in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d bound to our chassis
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.760 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.787 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[efd52e03-8c50-4a64-8af5-09cd21127c80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.832 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[43d299aa-7401-4e98-bfab-3ed4444993ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.839 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[6362be25-ecec-4261-89db-ef17e63bf725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.894 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[85f4a0db-425a-4afd-90b6-79752bf13fc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.923 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[85821464-b74e-4734-8017-200ce8a127b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 6, 'rx_bytes': 1126, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 6, 'rx_bytes': 1126, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443698, 'reachable_time': 36727, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222054, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.949 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0a094d76-3006-403b-a424-2ab78ac58bcc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443714, 'tstamp': 443714}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222055, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443718, 'tstamp': 443718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222055, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.951 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:48 compute-0 nova_compute[192567]: 2025-10-02 08:26:48.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:48 compute-0 nova_compute[192567]: 2025-10-02 08:26:48.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.956 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.957 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.958 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:48.958 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:26:49 compute-0 nova_compute[192567]: 2025-10-02 08:26:49.577 2 INFO nova.compute.manager [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Post operation of migration started
Oct 02 08:26:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:50.115 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:50 compute-0 nova_compute[192567]: 2025-10-02 08:26:50.156 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-f7e8d3db-fabd-4d87-9569-a348f1b9788c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:26:50 compute-0 nova_compute[192567]: 2025-10-02 08:26:50.157 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-f7e8d3db-fabd-4d87-9569-a348f1b9788c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:26:50 compute-0 nova_compute[192567]: 2025-10-02 08:26:50.158 2 DEBUG nova.network.neutron [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:26:50 compute-0 nova_compute[192567]: 2025-10-02 08:26:50.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:26:51 compute-0 nova_compute[192567]: 2025-10-02 08:26:51.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:51 compute-0 nova_compute[192567]: 2025-10-02 08:26:51.823 2 DEBUG nova.network.neutron [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Updating instance_info_cache with network_info: [{"id": "e27cfaea-ded9-45c4-87af-3213584cf98e", "address": "fa:16:3e:56:41:a6", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape27cfaea-de", "ovs_interfaceid": "e27cfaea-ded9-45c4-87af-3213584cf98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:26:51 compute-0 nova_compute[192567]: 2025-10-02 08:26:51.846 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-f7e8d3db-fabd-4d87-9569-a348f1b9788c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:26:51 compute-0 nova_compute[192567]: 2025-10-02 08:26:51.870 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:26:51 compute-0 nova_compute[192567]: 2025-10-02 08:26:51.871 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:26:51 compute-0 nova_compute[192567]: 2025-10-02 08:26:51.871 2 DEBUG oslo_concurrency.lockutils [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:26:51 compute-0 nova_compute[192567]: 2025-10-02 08:26:51.878 2 INFO nova.virt.libvirt.driver [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:26:51 compute-0 virtqemud[192112]: Domain id=14 name='instance-00000011' uuid=f7e8d3db-fabd-4d87-9569-a348f1b9788c is tainted: custom-monitor
Oct 02 08:26:52 compute-0 nova_compute[192567]: 2025-10-02 08:26:52.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:52 compute-0 nova_compute[192567]: 2025-10-02 08:26:52.887 2 INFO nova.virt.libvirt.driver [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:26:53 compute-0 nova_compute[192567]: 2025-10-02 08:26:53.895 2 INFO nova.virt.libvirt.driver [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:26:53 compute-0 nova_compute[192567]: 2025-10-02 08:26:53.902 2 DEBUG nova.compute.manager [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:26:53 compute-0 nova_compute[192567]: 2025-10-02 08:26:53.936 2 DEBUG nova.objects.instance [None req-0ecf333f-b434-41a3-b25b-d969bc1027e9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:26:56 compute-0 nova_compute[192567]: 2025-10-02 08:26:56.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:57 compute-0 podman[222056]: 2025-10-02 08:26:57.184258878 +0000 UTC m=+0.089685834 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 02 08:26:57 compute-0 nova_compute[192567]: 2025-10-02 08:26:57.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.308 2 DEBUG oslo_concurrency.lockutils [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.308 2 DEBUG oslo_concurrency.lockutils [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.309 2 DEBUG oslo_concurrency.lockutils [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.310 2 DEBUG oslo_concurrency.lockutils [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.310 2 DEBUG oslo_concurrency.lockutils [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.312 2 INFO nova.compute.manager [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Terminating instance
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.314 2 DEBUG nova.compute.manager [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:26:59 compute-0 kernel: tap3b0883a3-0e (unregistering): left promiscuous mode
Oct 02 08:26:59 compute-0 NetworkManager[51654]: <info>  [1759393619.3455] device (tap3b0883a3-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:59 compute-0 ovn_controller[94821]: 2025-10-02T08:26:59Z|00155|binding|INFO|Releasing lport 3b0883a3-0e37-423b-b7ad-46bd0fa49790 from this chassis (sb_readonly=0)
Oct 02 08:26:59 compute-0 ovn_controller[94821]: 2025-10-02T08:26:59Z|00156|binding|INFO|Setting lport 3b0883a3-0e37-423b-b7ad-46bd0fa49790 down in Southbound
Oct 02 08:26:59 compute-0 ovn_controller[94821]: 2025-10-02T08:26:59Z|00157|binding|INFO|Removing iface tap3b0883a3-0e ovn-installed in OVS
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.367 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:61:15 10.100.0.3'], port_security=['fa:16:3e:40:61:15 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd5de7c96-a157-43c1-b00a-4b54c1f7bb1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=3b0883a3-0e37-423b-b7ad-46bd0fa49790) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.369 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 3b0883a3-0e37-423b-b7ad-46bd0fa49790 in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.370 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.405 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[d64a391f-1c33-4973-b4c0-8f6c489038d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:59 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000012.scope: Deactivated successfully.
Oct 02 08:26:59 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000012.scope: Consumed 15.026s CPU time.
Oct 02 08:26:59 compute-0 systemd-machined[152597]: Machine qemu-13-instance-00000012 terminated.
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.456 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c79b6e-02d9-4cec-acd0-6b52298ac27b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.461 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[1d8b4a15-e1ff-45dd-9eda-bbaa5dfb61c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.513 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4dc954-955a-45cf-929e-d5201b2ed553]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.541 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[635b45cd-1506-4340-b10b-bf52da17c2a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443698, 'reachable_time': 36727, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222090, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.565 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[1b327eda-677a-42b7-8945-9c5dd657202a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443714, 'tstamp': 443714}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222096, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443718, 'tstamp': 443718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222096, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.569 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.582 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.582 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.583 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:26:59.584 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.622 2 INFO nova.virt.libvirt.driver [-] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Instance destroyed successfully.
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.623 2 DEBUG nova.objects.instance [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'resources' on Instance uuid d5de7c96-a157-43c1-b00a-4b54c1f7bb1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.647 2 DEBUG nova.virt.libvirt.vif [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:25:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1435343638',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1435343638',id=18,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:25:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-ggy7uluk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:25:50Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=d5de7c96-a157-43c1-b00a-4b54c1f7bb1c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.647 2 DEBUG nova.network.os_vif_util [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "address": "fa:16:3e:40:61:15", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b0883a3-0e", "ovs_interfaceid": "3b0883a3-0e37-423b-b7ad-46bd0fa49790", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.648 2 DEBUG nova.network.os_vif_util [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:40:61:15,bridge_name='br-int',has_traffic_filtering=True,id=3b0883a3-0e37-423b-b7ad-46bd0fa49790,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0883a3-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.648 2 DEBUG os_vif [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:40:61:15,bridge_name='br-int',has_traffic_filtering=True,id=3b0883a3-0e37-423b-b7ad-46bd0fa49790,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0883a3-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.651 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b0883a3-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.658 2 INFO os_vif [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:40:61:15,bridge_name='br-int',has_traffic_filtering=True,id=3b0883a3-0e37-423b-b7ad-46bd0fa49790,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b0883a3-0e')
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.659 2 INFO nova.virt.libvirt.driver [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Deleting instance files /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c_del
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.660 2 INFO nova.virt.libvirt.driver [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Deletion of /var/lib/nova/instances/d5de7c96-a157-43c1-b00a-4b54c1f7bb1c_del complete
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.714 2 INFO nova.compute.manager [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Took 0.40 seconds to destroy the instance on the hypervisor.
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.715 2 DEBUG oslo.service.loopingcall [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.715 2 DEBUG nova.compute.manager [-] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:26:59 compute-0 nova_compute[192567]: 2025-10-02 08:26:59.715 2 DEBUG nova.network.neutron [-] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:26:59 compute-0 podman[203011]: time="2025-10-02T08:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:26:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:26:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3469 "" "Go-http-client/1.1"
Oct 02 08:27:00 compute-0 nova_compute[192567]: 2025-10-02 08:27:00.377 2 DEBUG nova.compute.manager [req-041f5ef3-6c2b-4242-9d6e-9dfee867ba1a req-e8517637-c39e-430b-aee4-886669e1f0d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Received event network-vif-unplugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:27:00 compute-0 nova_compute[192567]: 2025-10-02 08:27:00.378 2 DEBUG oslo_concurrency.lockutils [req-041f5ef3-6c2b-4242-9d6e-9dfee867ba1a req-e8517637-c39e-430b-aee4-886669e1f0d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:00 compute-0 nova_compute[192567]: 2025-10-02 08:27:00.378 2 DEBUG oslo_concurrency.lockutils [req-041f5ef3-6c2b-4242-9d6e-9dfee867ba1a req-e8517637-c39e-430b-aee4-886669e1f0d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:00 compute-0 nova_compute[192567]: 2025-10-02 08:27:00.378 2 DEBUG oslo_concurrency.lockutils [req-041f5ef3-6c2b-4242-9d6e-9dfee867ba1a req-e8517637-c39e-430b-aee4-886669e1f0d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:00 compute-0 nova_compute[192567]: 2025-10-02 08:27:00.378 2 DEBUG nova.compute.manager [req-041f5ef3-6c2b-4242-9d6e-9dfee867ba1a req-e8517637-c39e-430b-aee4-886669e1f0d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] No waiting events found dispatching network-vif-unplugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:27:00 compute-0 nova_compute[192567]: 2025-10-02 08:27:00.379 2 DEBUG nova.compute.manager [req-041f5ef3-6c2b-4242-9d6e-9dfee867ba1a req-e8517637-c39e-430b-aee4-886669e1f0d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Received event network-vif-unplugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:27:01 compute-0 nova_compute[192567]: 2025-10-02 08:27:01.326 2 DEBUG nova.network.neutron [-] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:27:01 compute-0 nova_compute[192567]: 2025-10-02 08:27:01.347 2 INFO nova.compute.manager [-] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Took 1.63 seconds to deallocate network for instance.
Oct 02 08:27:01 compute-0 nova_compute[192567]: 2025-10-02 08:27:01.400 2 DEBUG oslo_concurrency.lockutils [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:01 compute-0 nova_compute[192567]: 2025-10-02 08:27:01.401 2 DEBUG oslo_concurrency.lockutils [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:01 compute-0 openstack_network_exporter[205118]: ERROR   08:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:27:01 compute-0 openstack_network_exporter[205118]: ERROR   08:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:27:01 compute-0 openstack_network_exporter[205118]: ERROR   08:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:27:01 compute-0 openstack_network_exporter[205118]: ERROR   08:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:27:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:27:01 compute-0 openstack_network_exporter[205118]: ERROR   08:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:27:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:27:01 compute-0 nova_compute[192567]: 2025-10-02 08:27:01.500 2 DEBUG nova.compute.provider_tree [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:27:01 compute-0 nova_compute[192567]: 2025-10-02 08:27:01.513 2 DEBUG nova.scheduler.client.report [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:27:01 compute-0 anacron[190925]: Job `cron.daily' started
Oct 02 08:27:01 compute-0 anacron[190925]: Job `cron.daily' terminated
Oct 02 08:27:01 compute-0 nova_compute[192567]: 2025-10-02 08:27:01.538 2 DEBUG oslo_concurrency.lockutils [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:01 compute-0 nova_compute[192567]: 2025-10-02 08:27:01.571 2 INFO nova.scheduler.client.report [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Deleted allocations for instance d5de7c96-a157-43c1-b00a-4b54c1f7bb1c
Oct 02 08:27:01 compute-0 nova_compute[192567]: 2025-10-02 08:27:01.613 2 DEBUG nova.compute.manager [req-40e435ff-8dd5-4388-9c1a-36c341c41f10 req-d6b137e3-85ef-4cbe-9364-b1e11365cc3c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Received event network-vif-deleted-3b0883a3-0e37-423b-b7ad-46bd0fa49790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:27:01 compute-0 nova_compute[192567]: 2025-10-02 08:27:01.666 2 DEBUG oslo_concurrency.lockutils [None req-9ee5b924-09e2-4c25-8291-7b75e7c5f1a6 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:02 compute-0 nova_compute[192567]: 2025-10-02 08:27:02.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:02 compute-0 nova_compute[192567]: 2025-10-02 08:27:02.494 2 DEBUG nova.compute.manager [req-bb87968b-2519-41e6-9210-4c5a4e104e66 req-d73fd53e-abc2-4528-a22c-a732058d27b8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Received event network-vif-plugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:27:02 compute-0 nova_compute[192567]: 2025-10-02 08:27:02.495 2 DEBUG oslo_concurrency.lockutils [req-bb87968b-2519-41e6-9210-4c5a4e104e66 req-d73fd53e-abc2-4528-a22c-a732058d27b8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:02 compute-0 nova_compute[192567]: 2025-10-02 08:27:02.496 2 DEBUG oslo_concurrency.lockutils [req-bb87968b-2519-41e6-9210-4c5a4e104e66 req-d73fd53e-abc2-4528-a22c-a732058d27b8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:02 compute-0 nova_compute[192567]: 2025-10-02 08:27:02.496 2 DEBUG oslo_concurrency.lockutils [req-bb87968b-2519-41e6-9210-4c5a4e104e66 req-d73fd53e-abc2-4528-a22c-a732058d27b8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "d5de7c96-a157-43c1-b00a-4b54c1f7bb1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:02 compute-0 nova_compute[192567]: 2025-10-02 08:27:02.497 2 DEBUG nova.compute.manager [req-bb87968b-2519-41e6-9210-4c5a4e104e66 req-d73fd53e-abc2-4528-a22c-a732058d27b8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] No waiting events found dispatching network-vif-plugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:27:02 compute-0 nova_compute[192567]: 2025-10-02 08:27:02.497 2 WARNING nova.compute.manager [req-bb87968b-2519-41e6-9210-4c5a4e104e66 req-d73fd53e-abc2-4528-a22c-a732058d27b8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Received unexpected event network-vif-plugged-3b0883a3-0e37-423b-b7ad-46bd0fa49790 for instance with vm_state deleted and task_state None.
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.453 2 DEBUG oslo_concurrency.lockutils [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.454 2 DEBUG oslo_concurrency.lockutils [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.455 2 DEBUG oslo_concurrency.lockutils [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.455 2 DEBUG oslo_concurrency.lockutils [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.456 2 DEBUG oslo_concurrency.lockutils [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.458 2 INFO nova.compute.manager [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Terminating instance
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.460 2 DEBUG nova.compute.manager [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:27:03 compute-0 kernel: tape27cfaea-de (unregistering): left promiscuous mode
Oct 02 08:27:03 compute-0 NetworkManager[51654]: <info>  [1759393623.4994] device (tape27cfaea-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:27:03 compute-0 ovn_controller[94821]: 2025-10-02T08:27:03Z|00158|binding|INFO|Releasing lport e27cfaea-ded9-45c4-87af-3213584cf98e from this chassis (sb_readonly=0)
Oct 02 08:27:03 compute-0 ovn_controller[94821]: 2025-10-02T08:27:03Z|00159|binding|INFO|Setting lport e27cfaea-ded9-45c4-87af-3213584cf98e down in Southbound
Oct 02 08:27:03 compute-0 ovn_controller[94821]: 2025-10-02T08:27:03Z|00160|binding|INFO|Removing iface tape27cfaea-de ovn-installed in OVS
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.525 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:41:a6 10.100.0.11'], port_security=['fa:16:3e:56:41:a6 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f7e8d3db-fabd-4d87-9569-a348f1b9788c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=e27cfaea-ded9-45c4-87af-3213584cf98e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.527 103703 INFO neutron.agent.ovn.metadata.agent [-] Port e27cfaea-ded9-45c4-87af-3213584cf98e in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.528 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.531 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[c0fe6d93-336d-4886-a3a2-872389d363d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.532 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace which is not needed anymore
Oct 02 08:27:03 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000011.scope: Deactivated successfully.
Oct 02 08:27:03 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000011.scope: Consumed 2.575s CPU time.
Oct 02 08:27:03 compute-0 systemd-machined[152597]: Machine qemu-14-instance-00000011 terminated.
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:03 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[221672]: [NOTICE]   (221677) : haproxy version is 2.8.14-c23fe91
Oct 02 08:27:03 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[221672]: [NOTICE]   (221677) : path to executable is /usr/sbin/haproxy
Oct 02 08:27:03 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[221672]: [WARNING]  (221677) : Exiting Master process...
Oct 02 08:27:03 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[221672]: [WARNING]  (221677) : Exiting Master process...
Oct 02 08:27:03 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[221672]: [ALERT]    (221677) : Current worker (221679) exited with code 143 (Terminated)
Oct 02 08:27:03 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[221672]: [WARNING]  (221677) : All workers exited. Exiting... (0)
Oct 02 08:27:03 compute-0 systemd[1]: libpod-0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91.scope: Deactivated successfully.
Oct 02 08:27:03 compute-0 podman[222137]: 2025-10-02 08:27:03.742945712 +0000 UTC m=+0.066458448 container died 0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.757 2 INFO nova.virt.libvirt.driver [-] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Instance destroyed successfully.
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.758 2 DEBUG nova.objects.instance [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'resources' on Instance uuid f7e8d3db-fabd-4d87-9569-a348f1b9788c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.775 2 DEBUG nova.virt.libvirt.vif [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:25:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1662760619',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1662760619',id=17,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:25:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-lcbhr566',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0
',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:26:54Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=f7e8d3db-fabd-4d87-9569-a348f1b9788c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e27cfaea-ded9-45c4-87af-3213584cf98e", "address": "fa:16:3e:56:41:a6", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape27cfaea-de", "ovs_interfaceid": "e27cfaea-ded9-45c4-87af-3213584cf98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.775 2 DEBUG nova.network.os_vif_util [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "e27cfaea-ded9-45c4-87af-3213584cf98e", "address": "fa:16:3e:56:41:a6", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape27cfaea-de", "ovs_interfaceid": "e27cfaea-ded9-45c4-87af-3213584cf98e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.776 2 DEBUG nova.network.os_vif_util [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:56:41:a6,bridge_name='br-int',has_traffic_filtering=True,id=e27cfaea-ded9-45c4-87af-3213584cf98e,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape27cfaea-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.776 2 DEBUG os_vif [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:41:a6,bridge_name='br-int',has_traffic_filtering=True,id=e27cfaea-ded9-45c4-87af-3213584cf98e,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape27cfaea-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.779 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape27cfaea-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91-userdata-shm.mount: Deactivated successfully.
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.785 2 INFO os_vif [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:41:a6,bridge_name='br-int',has_traffic_filtering=True,id=e27cfaea-ded9-45c4-87af-3213584cf98e,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape27cfaea-de')
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.787 2 INFO nova.virt.libvirt.driver [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Deleting instance files /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c_del
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.788 2 INFO nova.virt.libvirt.driver [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Deletion of /var/lib/nova/instances/f7e8d3db-fabd-4d87-9569-a348f1b9788c_del complete
Oct 02 08:27:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-1c7a70d6e41bd3f93a0b4b67a855057690728e39903bd66c4c1c54ac8f86d902-merged.mount: Deactivated successfully.
Oct 02 08:27:03 compute-0 podman[222137]: 2025-10-02 08:27:03.802221187 +0000 UTC m=+0.125733913 container cleanup 0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:27:03 compute-0 systemd[1]: libpod-conmon-0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91.scope: Deactivated successfully.
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.855 2 INFO nova.compute.manager [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Took 0.39 seconds to destroy the instance on the hypervisor.
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.856 2 DEBUG oslo.service.loopingcall [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.856 2 DEBUG nova.compute.manager [-] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.856 2 DEBUG nova.network.neutron [-] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:27:03 compute-0 podman[222184]: 2025-10-02 08:27:03.870297804 +0000 UTC m=+0.041467018 container remove 0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.875 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[c970a697-5613-43f1-9319-ebe8f1aaead7]: (4, ('Thu Oct  2 08:27:03 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91)\n0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91\nThu Oct  2 08:27:03 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91)\n0e0aed8be7895ce0eeed1b6b7f9a1ffa5b82652bb66f73a0dd8f7a3fb326cc91\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.877 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e177ba0f-7cac-411d-844d-96a7b2feb5c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.878 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:27:03 compute-0 kernel: tap08b16a0c-b0: left promiscuous mode
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:03 compute-0 nova_compute[192567]: 2025-10-02 08:27:03.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.898 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[546db6f5-dc93-4fc2-b223-326587db4d83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.923 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[caccedb3-3f90-490c-a614-6e8c5d25f648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.925 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[94115e5d-e851-40f4-8bdb-ea510154bde5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.939 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[edd1710e-301e-48a1-beb6-a9ca8ecd9848]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443688, 'reachable_time': 34553, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222199, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d08b16a0c\x2db69f\x2d4a34\x2d9bfe\x2d830099adfe8d.mount: Deactivated successfully.
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.945 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:27:03 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:03.945 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a908d4-b3ab-4cdb-93c2-d62bb8ff5429]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:04 compute-0 nova_compute[192567]: 2025-10-02 08:27:04.551 2 DEBUG nova.compute.manager [req-51cb5e01-2912-4304-bdff-b0382f9c6d7e req-b4fdb3a9-f8df-4f8c-9285-86ccee6ed685 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Received event network-vif-unplugged-e27cfaea-ded9-45c4-87af-3213584cf98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:27:04 compute-0 nova_compute[192567]: 2025-10-02 08:27:04.551 2 DEBUG oslo_concurrency.lockutils [req-51cb5e01-2912-4304-bdff-b0382f9c6d7e req-b4fdb3a9-f8df-4f8c-9285-86ccee6ed685 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:04 compute-0 nova_compute[192567]: 2025-10-02 08:27:04.551 2 DEBUG oslo_concurrency.lockutils [req-51cb5e01-2912-4304-bdff-b0382f9c6d7e req-b4fdb3a9-f8df-4f8c-9285-86ccee6ed685 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:04 compute-0 nova_compute[192567]: 2025-10-02 08:27:04.551 2 DEBUG oslo_concurrency.lockutils [req-51cb5e01-2912-4304-bdff-b0382f9c6d7e req-b4fdb3a9-f8df-4f8c-9285-86ccee6ed685 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:04 compute-0 nova_compute[192567]: 2025-10-02 08:27:04.551 2 DEBUG nova.compute.manager [req-51cb5e01-2912-4304-bdff-b0382f9c6d7e req-b4fdb3a9-f8df-4f8c-9285-86ccee6ed685 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] No waiting events found dispatching network-vif-unplugged-e27cfaea-ded9-45c4-87af-3213584cf98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:27:04 compute-0 nova_compute[192567]: 2025-10-02 08:27:04.552 2 DEBUG nova.compute.manager [req-51cb5e01-2912-4304-bdff-b0382f9c6d7e req-b4fdb3a9-f8df-4f8c-9285-86ccee6ed685 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Received event network-vif-unplugged-e27cfaea-ded9-45c4-87af-3213584cf98e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:27:05 compute-0 nova_compute[192567]: 2025-10-02 08:27:05.494 2 DEBUG nova.network.neutron [-] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:27:05 compute-0 nova_compute[192567]: 2025-10-02 08:27:05.511 2 INFO nova.compute.manager [-] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Took 1.65 seconds to deallocate network for instance.
Oct 02 08:27:05 compute-0 nova_compute[192567]: 2025-10-02 08:27:05.557 2 DEBUG oslo_concurrency.lockutils [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:05 compute-0 nova_compute[192567]: 2025-10-02 08:27:05.558 2 DEBUG oslo_concurrency.lockutils [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:05 compute-0 nova_compute[192567]: 2025-10-02 08:27:05.565 2 DEBUG oslo_concurrency.lockutils [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:05 compute-0 nova_compute[192567]: 2025-10-02 08:27:05.629 2 INFO nova.scheduler.client.report [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Deleted allocations for instance f7e8d3db-fabd-4d87-9569-a348f1b9788c
Oct 02 08:27:05 compute-0 nova_compute[192567]: 2025-10-02 08:27:05.730 2 DEBUG oslo_concurrency.lockutils [None req-e6664f28-acf9-4bf0-9279-d85525fef6d2 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:06 compute-0 nova_compute[192567]: 2025-10-02 08:27:06.674 2 DEBUG nova.compute.manager [req-c6287d66-a567-4c21-a352-d3cdffb42c95 req-8c1b70b3-43f3-49ff-a0b6-33525920b7d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Received event network-vif-plugged-e27cfaea-ded9-45c4-87af-3213584cf98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:27:06 compute-0 nova_compute[192567]: 2025-10-02 08:27:06.675 2 DEBUG oslo_concurrency.lockutils [req-c6287d66-a567-4c21-a352-d3cdffb42c95 req-8c1b70b3-43f3-49ff-a0b6-33525920b7d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:06 compute-0 nova_compute[192567]: 2025-10-02 08:27:06.676 2 DEBUG oslo_concurrency.lockutils [req-c6287d66-a567-4c21-a352-d3cdffb42c95 req-8c1b70b3-43f3-49ff-a0b6-33525920b7d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:06 compute-0 nova_compute[192567]: 2025-10-02 08:27:06.676 2 DEBUG oslo_concurrency.lockutils [req-c6287d66-a567-4c21-a352-d3cdffb42c95 req-8c1b70b3-43f3-49ff-a0b6-33525920b7d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f7e8d3db-fabd-4d87-9569-a348f1b9788c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:06 compute-0 nova_compute[192567]: 2025-10-02 08:27:06.677 2 DEBUG nova.compute.manager [req-c6287d66-a567-4c21-a352-d3cdffb42c95 req-8c1b70b3-43f3-49ff-a0b6-33525920b7d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] No waiting events found dispatching network-vif-plugged-e27cfaea-ded9-45c4-87af-3213584cf98e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:27:06 compute-0 nova_compute[192567]: 2025-10-02 08:27:06.677 2 WARNING nova.compute.manager [req-c6287d66-a567-4c21-a352-d3cdffb42c95 req-8c1b70b3-43f3-49ff-a0b6-33525920b7d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Received unexpected event network-vif-plugged-e27cfaea-ded9-45c4-87af-3213584cf98e for instance with vm_state deleted and task_state None.
Oct 02 08:27:06 compute-0 nova_compute[192567]: 2025-10-02 08:27:06.678 2 DEBUG nova.compute.manager [req-c6287d66-a567-4c21-a352-d3cdffb42c95 req-8c1b70b3-43f3-49ff-a0b6-33525920b7d6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Received event network-vif-deleted-e27cfaea-ded9-45c4-87af-3213584cf98e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:27:07 compute-0 nova_compute[192567]: 2025-10-02 08:27:07.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:08 compute-0 nova_compute[192567]: 2025-10-02 08:27:08.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:09 compute-0 podman[222200]: 2025-10-02 08:27:09.19203216 +0000 UTC m=+0.099211006 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 02 08:27:09 compute-0 podman[222203]: 2025-10-02 08:27:09.21149947 +0000 UTC m=+0.099632720 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 02 08:27:09 compute-0 podman[222202]: 2025-10-02 08:27:09.220859109 +0000 UTC m=+0.113294421 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Oct 02 08:27:09 compute-0 podman[222201]: 2025-10-02 08:27:09.25565575 +0000 UTC m=+0.156579424 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 02 08:27:12 compute-0 nova_compute[192567]: 2025-10-02 08:27:12.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:13 compute-0 nova_compute[192567]: 2025-10-02 08:27:13.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:14 compute-0 nova_compute[192567]: 2025-10-02 08:27:14.620 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393619.6200507, d5de7c96-a157-43c1-b00a-4b54c1f7bb1c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:27:14 compute-0 nova_compute[192567]: 2025-10-02 08:27:14.621 2 INFO nova.compute.manager [-] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] VM Stopped (Lifecycle Event)
Oct 02 08:27:14 compute-0 nova_compute[192567]: 2025-10-02 08:27:14.649 2 DEBUG nova.compute.manager [None req-207fb4ce-6ba9-40af-9ccc-7fbfb3cbd700 - - - - - -] [instance: d5de7c96-a157-43c1-b00a-4b54c1f7bb1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:27:15 compute-0 podman[222281]: 2025-10-02 08:27:15.17333637 +0000 UTC m=+0.083707269 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:27:17 compute-0 nova_compute[192567]: 2025-10-02 08:27:17.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:18 compute-0 nova_compute[192567]: 2025-10-02 08:27:18.753 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393623.7495275, f7e8d3db-fabd-4d87-9569-a348f1b9788c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:27:18 compute-0 nova_compute[192567]: 2025-10-02 08:27:18.754 2 INFO nova.compute.manager [-] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] VM Stopped (Lifecycle Event)
Oct 02 08:27:18 compute-0 nova_compute[192567]: 2025-10-02 08:27:18.791 2 DEBUG nova.compute.manager [None req-4ae65dd4-fa5d-423e-bb34-371336906f87 - - - - - -] [instance: f7e8d3db-fabd-4d87-9569-a348f1b9788c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:27:18 compute-0 nova_compute[192567]: 2025-10-02 08:27:18.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:22 compute-0 nova_compute[192567]: 2025-10-02 08:27:22.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:23 compute-0 nova_compute[192567]: 2025-10-02 08:27:23.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:27 compute-0 nova_compute[192567]: 2025-10-02 08:27:27.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:28 compute-0 podman[222308]: 2025-10-02 08:27:28.19665388 +0000 UTC m=+0.097319579 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Oct 02 08:27:28 compute-0 nova_compute[192567]: 2025-10-02 08:27:28.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:29 compute-0 podman[203011]: time="2025-10-02T08:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:27:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:27:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3007 "" "Go-http-client/1.1"
Oct 02 08:27:31 compute-0 openstack_network_exporter[205118]: ERROR   08:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:27:31 compute-0 openstack_network_exporter[205118]: ERROR   08:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:27:31 compute-0 openstack_network_exporter[205118]: ERROR   08:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:27:31 compute-0 openstack_network_exporter[205118]: ERROR   08:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:27:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:27:31 compute-0 openstack_network_exporter[205118]: ERROR   08:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:27:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:27:32 compute-0 nova_compute[192567]: 2025-10-02 08:27:32.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:33 compute-0 nova_compute[192567]: 2025-10-02 08:27:33.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:36 compute-0 nova_compute[192567]: 2025-10-02 08:27:36.642 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:27:36 compute-0 nova_compute[192567]: 2025-10-02 08:27:36.642 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:27:36 compute-0 nova_compute[192567]: 2025-10-02 08:27:36.642 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:27:36 compute-0 nova_compute[192567]: 2025-10-02 08:27:36.656 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:27:37 compute-0 nova_compute[192567]: 2025-10-02 08:27:37.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.654 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.654 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.655 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.655 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.852 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.853 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5874MB free_disk=73.46508407592773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.854 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.854 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.924 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:27:38 compute-0 nova_compute[192567]: 2025-10-02 08:27:38.924 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:27:39 compute-0 nova_compute[192567]: 2025-10-02 08:27:39.019 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:27:39 compute-0 nova_compute[192567]: 2025-10-02 08:27:39.037 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:27:39 compute-0 nova_compute[192567]: 2025-10-02 08:27:39.066 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:27:39 compute-0 nova_compute[192567]: 2025-10-02 08:27:39.066 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:40 compute-0 nova_compute[192567]: 2025-10-02 08:27:40.063 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:27:40 compute-0 nova_compute[192567]: 2025-10-02 08:27:40.064 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:27:40 compute-0 podman[222335]: 2025-10-02 08:27:40.216584966 +0000 UTC m=+0.086904568 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:27:40 compute-0 podman[222336]: 2025-10-02 08:27:40.22226951 +0000 UTC m=+0.089412074 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 02 08:27:40 compute-0 podman[222333]: 2025-10-02 08:27:40.228193713 +0000 UTC m=+0.110221886 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:27:40 compute-0 podman[222334]: 2025-10-02 08:27:40.230452853 +0000 UTC m=+0.112332781 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller)
Oct 02 08:27:40 compute-0 nova_compute[192567]: 2025-10-02 08:27:40.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:27:40 compute-0 nova_compute[192567]: 2025-10-02 08:27:40.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:27:42 compute-0 nova_compute[192567]: 2025-10-02 08:27:42.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:43 compute-0 nova_compute[192567]: 2025-10-02 08:27:43.565 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:43 compute-0 nova_compute[192567]: 2025-10-02 08:27:43.566 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:43 compute-0 nova_compute[192567]: 2025-10-02 08:27:43.587 2 DEBUG nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:27:43 compute-0 nova_compute[192567]: 2025-10-02 08:27:43.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:27:43 compute-0 nova_compute[192567]: 2025-10-02 08:27:43.665 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:27:43 compute-0 nova_compute[192567]: 2025-10-02 08:27:43.695 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:43 compute-0 nova_compute[192567]: 2025-10-02 08:27:43.696 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:43 compute-0 nova_compute[192567]: 2025-10-02 08:27:43.704 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:27:43 compute-0 nova_compute[192567]: 2025-10-02 08:27:43.705 2 INFO nova.compute.claims [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:27:43 compute-0 nova_compute[192567]: 2025-10-02 08:27:43.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.085 2 DEBUG nova.compute.provider_tree [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.103 2 DEBUG nova.scheduler.client.report [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.127 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.129 2 DEBUG nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.179 2 DEBUG nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.180 2 DEBUG nova.network.neutron [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.204 2 INFO nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.230 2 DEBUG nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.474 2 DEBUG nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.476 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.477 2 INFO nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Creating image(s)
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.478 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "/var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.479 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "/var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.480 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "/var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.510 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.611 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.613 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.614 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.638 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.734 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.736 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.787 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.790 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.791 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.873 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.875 2 DEBUG nova.virt.disk.api [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Checking if we can resize image /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.876 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.931 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.932 2 DEBUG nova.virt.disk.api [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Cannot resize image /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.933 2 DEBUG nova.objects.instance [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'migration_context' on Instance uuid 09ce154f-a8dc-447f-9b90-c08e3249d3e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.947 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.948 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Ensure instance console log exists: /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.948 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.948 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:44 compute-0 nova_compute[192567]: 2025-10-02 08:27:44.948 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:45 compute-0 nova_compute[192567]: 2025-10-02 08:27:45.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:27:45 compute-0 nova_compute[192567]: 2025-10-02 08:27:45.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:27:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:45.986 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:45.987 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:45.987 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:46 compute-0 podman[222429]: 2025-10-02 08:27:46.174795795 +0000 UTC m=+0.075466206 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:27:47 compute-0 nova_compute[192567]: 2025-10-02 08:27:47.287 2 DEBUG nova.network.neutron [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Successfully created port: 45c35630-cf9e-45fc-b081-384e2a1425de _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:27:47 compute-0 nova_compute[192567]: 2025-10-02 08:27:47.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:48 compute-0 nova_compute[192567]: 2025-10-02 08:27:48.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:49 compute-0 nova_compute[192567]: 2025-10-02 08:27:49.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:27:50 compute-0 nova_compute[192567]: 2025-10-02 08:27:50.222 2 DEBUG nova.network.neutron [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Successfully updated port: 45c35630-cf9e-45fc-b081-384e2a1425de _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:27:50 compute-0 nova_compute[192567]: 2025-10-02 08:27:50.242 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:27:50 compute-0 nova_compute[192567]: 2025-10-02 08:27:50.242 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquired lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:27:50 compute-0 nova_compute[192567]: 2025-10-02 08:27:50.243 2 DEBUG nova.network.neutron [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:27:50 compute-0 nova_compute[192567]: 2025-10-02 08:27:50.324 2 DEBUG nova.compute.manager [req-8c6b496d-1e1b-4a01-ab86-13dc80b6e0c2 req-a71d6c1c-092c-418d-8f6f-17701765e189 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-changed-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:27:50 compute-0 nova_compute[192567]: 2025-10-02 08:27:50.325 2 DEBUG nova.compute.manager [req-8c6b496d-1e1b-4a01-ab86-13dc80b6e0c2 req-a71d6c1c-092c-418d-8f6f-17701765e189 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Refreshing instance network info cache due to event network-changed-45c35630-cf9e-45fc-b081-384e2a1425de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:27:50 compute-0 nova_compute[192567]: 2025-10-02 08:27:50.325 2 DEBUG oslo_concurrency.lockutils [req-8c6b496d-1e1b-4a01-ab86-13dc80b6e0c2 req-a71d6c1c-092c-418d-8f6f-17701765e189 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:27:50 compute-0 nova_compute[192567]: 2025-10-02 08:27:50.437 2 DEBUG nova.network.neutron [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.206 2 DEBUG nova.network.neutron [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Updating instance_info_cache with network_info: [{"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.226 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Releasing lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.227 2 DEBUG nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Instance network_info: |[{"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.227 2 DEBUG oslo_concurrency.lockutils [req-8c6b496d-1e1b-4a01-ab86-13dc80b6e0c2 req-a71d6c1c-092c-418d-8f6f-17701765e189 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.228 2 DEBUG nova.network.neutron [req-8c6b496d-1e1b-4a01-ab86-13dc80b6e0c2 req-a71d6c1c-092c-418d-8f6f-17701765e189 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Refreshing network info cache for port 45c35630-cf9e-45fc-b081-384e2a1425de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.232 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Start _get_guest_xml network_info=[{"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.239 2 WARNING nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.250 2 DEBUG nova.virt.libvirt.host [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.250 2 DEBUG nova.virt.libvirt.host [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.255 2 DEBUG nova.virt.libvirt.host [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.256 2 DEBUG nova.virt.libvirt.host [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.257 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.257 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.258 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.258 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.259 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.259 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.260 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.260 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.261 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.261 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.262 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.262 2 DEBUG nova.virt.hardware [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.268 2 DEBUG nova.virt.libvirt.vif [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:27:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-384391495',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-384391495',id=20,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-7nf09cwv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:27:44Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=09ce154f-a8dc-447f-9b90-c08e3249d3e3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.268 2 DEBUG nova.network.os_vif_util [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.270 2 DEBUG nova.network.os_vif_util [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:38:b8,bridge_name='br-int',has_traffic_filtering=True,id=45c35630-cf9e-45fc-b081-384e2a1425de,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45c35630-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.271 2 DEBUG nova.objects.instance [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'pci_devices' on Instance uuid 09ce154f-a8dc-447f-9b90-c08e3249d3e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.289 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <uuid>09ce154f-a8dc-447f-9b90-c08e3249d3e3</uuid>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <name>instance-00000014</name>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteStrategies-server-384391495</nova:name>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:27:52</nova:creationTime>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:27:52 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:27:52 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:27:52 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:27:52 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:27:52 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:27:52 compute-0 nova_compute[192567]:         <nova:user uuid="bf38fbc8dd7b4c4db6c469a7951b0942">tempest-TestExecuteStrategies-1382092507-project-admin</nova:user>
Oct 02 08:27:52 compute-0 nova_compute[192567]:         <nova:project uuid="1ea832b474574009921dff909e4daeaf">tempest-TestExecuteStrategies-1382092507</nova:project>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:27:52 compute-0 nova_compute[192567]:         <nova:port uuid="45c35630-cf9e-45fc-b081-384e2a1425de">
Oct 02 08:27:52 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <system>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <entry name="serial">09ce154f-a8dc-447f-9b90-c08e3249d3e3</entry>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <entry name="uuid">09ce154f-a8dc-447f-9b90-c08e3249d3e3</entry>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     </system>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <os>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   </os>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <features>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   </features>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk.config"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:68:38:b8"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <target dev="tap45c35630-cf"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/console.log" append="off"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <video>
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     </video>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:27:52 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:27:52 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:27:52 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:27:52 compute-0 nova_compute[192567]: </domain>
Oct 02 08:27:52 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.291 2 DEBUG nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Preparing to wait for external event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.291 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.292 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.292 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.293 2 DEBUG nova.virt.libvirt.vif [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:27:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-384391495',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-384391495',id=20,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-7nf09cwv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:27:44Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=09ce154f-a8dc-447f-9b90-c08e3249d3e3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.294 2 DEBUG nova.network.os_vif_util [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.294 2 DEBUG nova.network.os_vif_util [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:38:b8,bridge_name='br-int',has_traffic_filtering=True,id=45c35630-cf9e-45fc-b081-384e2a1425de,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45c35630-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.295 2 DEBUG os_vif [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:38:b8,bridge_name='br-int',has_traffic_filtering=True,id=45c35630-cf9e-45fc-b081-384e2a1425de,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45c35630-cf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.296 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.297 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.301 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap45c35630-cf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.301 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap45c35630-cf, col_values=(('external_ids', {'iface-id': '45c35630-cf9e-45fc-b081-384e2a1425de', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:38:b8', 'vm-uuid': '09ce154f-a8dc-447f-9b90-c08e3249d3e3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:27:52 compute-0 NetworkManager[51654]: <info>  [1759393672.3052] manager: (tap45c35630-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.314 2 INFO os_vif [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:38:b8,bridge_name='br-int',has_traffic_filtering=True,id=45c35630-cf9e-45fc-b081-384e2a1425de,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45c35630-cf')
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.377 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.378 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.378 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] No VIF found with MAC fa:16:3e:68:38:b8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.379 2 INFO nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Using config drive
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.770 2 INFO nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Creating config drive at /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk.config
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.777 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn1gfwo5x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:27:52 compute-0 nova_compute[192567]: 2025-10-02 08:27:52.918 2 DEBUG oslo_concurrency.processutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn1gfwo5x" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:27:53 compute-0 kernel: tap45c35630-cf: entered promiscuous mode
Oct 02 08:27:53 compute-0 NetworkManager[51654]: <info>  [1759393673.0207] manager: (tap45c35630-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Oct 02 08:27:53 compute-0 ovn_controller[94821]: 2025-10-02T08:27:53Z|00161|binding|INFO|Claiming lport 45c35630-cf9e-45fc-b081-384e2a1425de for this chassis.
Oct 02 08:27:53 compute-0 ovn_controller[94821]: 2025-10-02T08:27:53Z|00162|binding|INFO|45c35630-cf9e-45fc-b081-384e2a1425de: Claiming fa:16:3e:68:38:b8 10.100.0.10
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.028 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:38:b8 10.100.0.10'], port_security=['fa:16:3e:68:38:b8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '09ce154f-a8dc-447f-9b90-c08e3249d3e3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=45c35630-cf9e-45fc-b081-384e2a1425de) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.031 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 45c35630-cf9e-45fc-b081-384e2a1425de in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d bound to our chassis
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.033 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:27:53 compute-0 ovn_controller[94821]: 2025-10-02T08:27:53Z|00163|binding|INFO|Setting lport 45c35630-cf9e-45fc-b081-384e2a1425de ovn-installed in OVS
Oct 02 08:27:53 compute-0 ovn_controller[94821]: 2025-10-02T08:27:53Z|00164|binding|INFO|Setting lport 45c35630-cf9e-45fc-b081-384e2a1425de up in Southbound
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.054 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[405848f9-11a5-431f-b735-bf6e76f27c67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.056 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08b16a0c-b1 in ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.062 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08b16a0c-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.062 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[44ded8bd-f5fc-480d-a884-53eba55e0cec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.063 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f9d875e7-743d-4d74-8124-6330d9a411b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 systemd-udevd[222474]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.081 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[bdfc062e-6f38-456a-ac86-f7aee709c8a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 systemd-machined[152597]: New machine qemu-15-instance-00000014.
Oct 02 08:27:53 compute-0 NetworkManager[51654]: <info>  [1759393673.0953] device (tap45c35630-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:27:53 compute-0 NetworkManager[51654]: <info>  [1759393673.0966] device (tap45c35630-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.103 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[68d3c9e5-114b-43e0-a19a-68d9f83c2782]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000014.
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.140 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[f704a312-2f0f-4844-8e7d-c7bb021badf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.149 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[29fcbe6a-f151-4ec2-a8d3-f3a4bb23865b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 NetworkManager[51654]: <info>  [1759393673.1527] manager: (tap08b16a0c-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.197 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[df0609c5-76a5-459a-ba72-7d85dc7f00f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.199 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[21cd80ed-aac8-487a-9a75-df554dfe8b89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 NetworkManager[51654]: <info>  [1759393673.2316] device (tap08b16a0c-b0): carrier: link connected
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.241 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[b9987fa1-1e59-42f8-9f59-853e70066816]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.267 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5c2a50-eb4a-4c8d-80b3-b2efbb4bc92c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456082, 'reachable_time': 41513, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222505, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.287 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0a9bc297-30b9-421f-8d00-33ca226e0bd6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:c53f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 456082, 'tstamp': 456082}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222506, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.309 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[dd48180d-ff2e-4715-89df-18d192315861]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456082, 'reachable_time': 41513, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222507, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.357 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f4f34e-f4e9-416e-b95c-aa884c0c9997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.371 2 DEBUG nova.compute.manager [req-99d44320-0173-4587-ab32-41e9b38f71b2 req-d16caccd-78de-4054-ba5d-6da03acf8566 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.372 2 DEBUG oslo_concurrency.lockutils [req-99d44320-0173-4587-ab32-41e9b38f71b2 req-d16caccd-78de-4054-ba5d-6da03acf8566 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.372 2 DEBUG oslo_concurrency.lockutils [req-99d44320-0173-4587-ab32-41e9b38f71b2 req-d16caccd-78de-4054-ba5d-6da03acf8566 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.372 2 DEBUG oslo_concurrency.lockutils [req-99d44320-0173-4587-ab32-41e9b38f71b2 req-d16caccd-78de-4054-ba5d-6da03acf8566 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.372 2 DEBUG nova.compute.manager [req-99d44320-0173-4587-ab32-41e9b38f71b2 req-d16caccd-78de-4054-ba5d-6da03acf8566 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Processing event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.438 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f5258cb5-740b-47e1-a6d3-f5a954391778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.439 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.439 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.440 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:53 compute-0 NetworkManager[51654]: <info>  [1759393673.4429] manager: (tap08b16a0c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Oct 02 08:27:53 compute-0 kernel: tap08b16a0c-b0: entered promiscuous mode
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.445 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:53 compute-0 ovn_controller[94821]: 2025-10-02T08:27:53Z|00165|binding|INFO|Releasing lport 748eef31-77a8-4b04-b6b7-dc0f7cc1cf65 from this chassis (sb_readonly=0)
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.448 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.449 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[6f539751-f165-42a7-9fca-79c9df4cef4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.450 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:27:53 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:27:53.451 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'env', 'PROCESS_TAG=haproxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08b16a0c-b69f-4a34-9bfe-830099adfe8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:53 compute-0 podman[222546]: 2025-10-02 08:27:53.897650574 +0000 UTC m=+0.080461059 container create 5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 02 08:27:53 compute-0 systemd[1]: Started libpod-conmon-5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74.scope.
Oct 02 08:27:53 compute-0 podman[222546]: 2025-10-02 08:27:53.861868092 +0000 UTC m=+0.044678627 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:27:53 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:27:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0fa7b65d71484909306df798ff586dc4ca7bd5218f2fa75183f16ffffde3e745/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.994 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393673.9933262, 09ce154f-a8dc-447f-9b90-c08e3249d3e3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.994 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] VM Started (Lifecycle Event)
Oct 02 08:27:53 compute-0 nova_compute[192567]: 2025-10-02 08:27:53.998 2 DEBUG nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:27:54 compute-0 podman[222546]: 2025-10-02 08:27:54.002655158 +0000 UTC m=+0.185465713 container init 5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.004 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.008 2 INFO nova.virt.libvirt.driver [-] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Instance spawned successfully.
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.008 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:27:54 compute-0 podman[222546]: 2025-10-02 08:27:54.009209009 +0000 UTC m=+0.192019494 container start 5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.023 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:27:54 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[222562]: [NOTICE]   (222566) : New worker (222568) forked
Oct 02 08:27:54 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[222562]: [NOTICE]   (222566) : Loading success.
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.033 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.039 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.039 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.040 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.041 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.041 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.042 2 DEBUG nova.virt.libvirt.driver [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.067 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.068 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393673.9935124, 09ce154f-a8dc-447f-9b90-c08e3249d3e3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.068 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] VM Paused (Lifecycle Event)
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.096 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.100 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393674.002895, 09ce154f-a8dc-447f-9b90-c08e3249d3e3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.100 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] VM Resumed (Lifecycle Event)
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.109 2 INFO nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Took 9.63 seconds to spawn the instance on the hypervisor.
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.110 2 DEBUG nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.137 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.141 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.161 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.173 2 INFO nova.compute.manager [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Took 10.51 seconds to build instance.
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.177 2 DEBUG nova.network.neutron [req-8c6b496d-1e1b-4a01-ab86-13dc80b6e0c2 req-a71d6c1c-092c-418d-8f6f-17701765e189 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Updated VIF entry in instance network info cache for port 45c35630-cf9e-45fc-b081-384e2a1425de. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.177 2 DEBUG nova.network.neutron [req-8c6b496d-1e1b-4a01-ab86-13dc80b6e0c2 req-a71d6c1c-092c-418d-8f6f-17701765e189 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Updating instance_info_cache with network_info: [{"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.197 2 DEBUG oslo_concurrency.lockutils [req-8c6b496d-1e1b-4a01-ab86-13dc80b6e0c2 req-a71d6c1c-092c-418d-8f6f-17701765e189 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:27:54 compute-0 nova_compute[192567]: 2025-10-02 08:27:54.198 2 DEBUG oslo_concurrency.lockutils [None req-0f658e0d-4b76-4d15-b34a-7f25e074419e bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:55 compute-0 nova_compute[192567]: 2025-10-02 08:27:55.457 2 DEBUG nova.compute.manager [req-359f9a9e-467d-4a18-8609-af118cf74cf2 req-c2ef2326-fc86-4638-a20c-00d7abe4550d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:27:55 compute-0 nova_compute[192567]: 2025-10-02 08:27:55.458 2 DEBUG oslo_concurrency.lockutils [req-359f9a9e-467d-4a18-8609-af118cf74cf2 req-c2ef2326-fc86-4638-a20c-00d7abe4550d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:27:55 compute-0 nova_compute[192567]: 2025-10-02 08:27:55.458 2 DEBUG oslo_concurrency.lockutils [req-359f9a9e-467d-4a18-8609-af118cf74cf2 req-c2ef2326-fc86-4638-a20c-00d7abe4550d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:27:55 compute-0 nova_compute[192567]: 2025-10-02 08:27:55.459 2 DEBUG oslo_concurrency.lockutils [req-359f9a9e-467d-4a18-8609-af118cf74cf2 req-c2ef2326-fc86-4638-a20c-00d7abe4550d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:27:55 compute-0 nova_compute[192567]: 2025-10-02 08:27:55.459 2 DEBUG nova.compute.manager [req-359f9a9e-467d-4a18-8609-af118cf74cf2 req-c2ef2326-fc86-4638-a20c-00d7abe4550d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] No waiting events found dispatching network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:27:55 compute-0 nova_compute[192567]: 2025-10-02 08:27:55.459 2 WARNING nova.compute.manager [req-359f9a9e-467d-4a18-8609-af118cf74cf2 req-c2ef2326-fc86-4638-a20c-00d7abe4550d 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received unexpected event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de for instance with vm_state active and task_state None.
Oct 02 08:27:57 compute-0 nova_compute[192567]: 2025-10-02 08:27:57.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:57 compute-0 nova_compute[192567]: 2025-10-02 08:27:57.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:27:59 compute-0 podman[222577]: 2025-10-02 08:27:59.166820761 +0000 UTC m=+0.079214561 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public)
Oct 02 08:27:59 compute-0 podman[203011]: time="2025-10-02T08:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:27:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:27:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3467 "" "Go-http-client/1.1"
Oct 02 08:28:01 compute-0 openstack_network_exporter[205118]: ERROR   08:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:28:01 compute-0 openstack_network_exporter[205118]: ERROR   08:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:28:01 compute-0 openstack_network_exporter[205118]: ERROR   08:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:28:01 compute-0 openstack_network_exporter[205118]: ERROR   08:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:28:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:28:01 compute-0 openstack_network_exporter[205118]: ERROR   08:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:28:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:28:02 compute-0 nova_compute[192567]: 2025-10-02 08:28:02.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:02 compute-0 nova_compute[192567]: 2025-10-02 08:28:02.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:06 compute-0 ovn_controller[94821]: 2025-10-02T08:28:06Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:68:38:b8 10.100.0.10
Oct 02 08:28:06 compute-0 ovn_controller[94821]: 2025-10-02T08:28:06Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:68:38:b8 10.100.0.10
Oct 02 08:28:07 compute-0 nova_compute[192567]: 2025-10-02 08:28:07.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:07 compute-0 nova_compute[192567]: 2025-10-02 08:28:07.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:11 compute-0 podman[222612]: 2025-10-02 08:28:11.170204187 +0000 UTC m=+0.070980387 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct 02 08:28:11 compute-0 podman[222610]: 2025-10-02 08:28:11.17840466 +0000 UTC m=+0.085212156 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:28:11 compute-0 podman[222613]: 2025-10-02 08:28:11.179296807 +0000 UTC m=+0.078667984 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:28:11 compute-0 podman[222611]: 2025-10-02 08:28:11.246674322 +0000 UTC m=+0.147237935 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct 02 08:28:12 compute-0 nova_compute[192567]: 2025-10-02 08:28:12.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:12 compute-0 nova_compute[192567]: 2025-10-02 08:28:12.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:17 compute-0 podman[222687]: 2025-10-02 08:28:17.178060744 +0000 UTC m=+0.079718735 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:28:17 compute-0 nova_compute[192567]: 2025-10-02 08:28:17.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:17 compute-0 nova_compute[192567]: 2025-10-02 08:28:17.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:22 compute-0 nova_compute[192567]: 2025-10-02 08:28:22.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:22 compute-0 nova_compute[192567]: 2025-10-02 08:28:22.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:23 compute-0 ovn_controller[94821]: 2025-10-02T08:28:23Z|00166|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Oct 02 08:28:27 compute-0 nova_compute[192567]: 2025-10-02 08:28:27.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:27 compute-0 nova_compute[192567]: 2025-10-02 08:28:27.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:29 compute-0 podman[203011]: time="2025-10-02T08:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:28:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:28:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Oct 02 08:28:30 compute-0 podman[222712]: 2025-10-02 08:28:30.192335496 +0000 UTC m=+0.096207924 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest 
release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350)
Oct 02 08:28:31 compute-0 openstack_network_exporter[205118]: ERROR   08:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:28:31 compute-0 openstack_network_exporter[205118]: ERROR   08:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:28:31 compute-0 openstack_network_exporter[205118]: ERROR   08:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:28:31 compute-0 openstack_network_exporter[205118]: ERROR   08:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:28:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:28:31 compute-0 openstack_network_exporter[205118]: ERROR   08:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:28:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:28:32 compute-0 nova_compute[192567]: 2025-10-02 08:28:32.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:32 compute-0 nova_compute[192567]: 2025-10-02 08:28:32.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:37 compute-0 nova_compute[192567]: 2025-10-02 08:28:37.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:37 compute-0 nova_compute[192567]: 2025-10-02 08:28:37.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:38 compute-0 nova_compute[192567]: 2025-10-02 08:28:38.308 2 DEBUG nova.compute.manager [None req-2a45b393-7706-4dc1-8a7e-aa55a8c50aa9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Oct 02 08:28:38 compute-0 nova_compute[192567]: 2025-10-02 08:28:38.374 2 DEBUG nova.compute.provider_tree [None req-2a45b393-7706-4dc1-8a7e-aa55a8c50aa9 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 27 to 31 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:28:38 compute-0 nova_compute[192567]: 2025-10-02 08:28:38.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:28:38 compute-0 nova_compute[192567]: 2025-10-02 08:28:38.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:28:38 compute-0 nova_compute[192567]: 2025-10-02 08:28:38.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:28:39 compute-0 nova_compute[192567]: 2025-10-02 08:28:39.382 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:28:39 compute-0 nova_compute[192567]: 2025-10-02 08:28:39.384 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:28:39 compute-0 nova_compute[192567]: 2025-10-02 08:28:39.384 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:28:39 compute-0 nova_compute[192567]: 2025-10-02 08:28:39.385 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 09ce154f-a8dc-447f-9b90-c08e3249d3e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:28:40 compute-0 nova_compute[192567]: 2025-10-02 08:28:40.878 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Updating instance_info_cache with network_info: [{"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:28:40 compute-0 nova_compute[192567]: 2025-10-02 08:28:40.916 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:28:40 compute-0 nova_compute[192567]: 2025-10-02 08:28:40.918 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:28:40 compute-0 nova_compute[192567]: 2025-10-02 08:28:40.919 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:28:40 compute-0 nova_compute[192567]: 2025-10-02 08:28:40.919 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:28:40 compute-0 nova_compute[192567]: 2025-10-02 08:28:40.920 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:28:40 compute-0 nova_compute[192567]: 2025-10-02 08:28:40.944 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:40 compute-0 nova_compute[192567]: 2025-10-02 08:28:40.945 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:40 compute-0 nova_compute[192567]: 2025-10-02 08:28:40.945 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:40 compute-0 nova_compute[192567]: 2025-10-02 08:28:40.946 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.027 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.116 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.117 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.188 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.356 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.358 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.43624496459961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.358 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.358 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.440 2 INFO nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 43e0143b-27e0-44e4-a638-c33d49573e91 has allocations against this compute host but is not found in the database.
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.440 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.440 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.481 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.495 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.526 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:28:41 compute-0 nova_compute[192567]: 2025-10-02 08:28:41.527 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:42 compute-0 podman[222742]: 2025-10-02 08:28:42.179081493 +0000 UTC m=+0.087703863 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Oct 02 08:28:42 compute-0 podman[222750]: 2025-10-02 08:28:42.199302465 +0000 UTC m=+0.086923258 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 02 08:28:42 compute-0 podman[222743]: 2025-10-02 08:28:42.220011793 +0000 UTC m=+0.120552543 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:28:42 compute-0 podman[222744]: 2025-10-02 08:28:42.240542465 +0000 UTC m=+0.129802948 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 02 08:28:42 compute-0 nova_compute[192567]: 2025-10-02 08:28:42.522 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:28:42 compute-0 nova_compute[192567]: 2025-10-02 08:28:42.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:42 compute-0 nova_compute[192567]: 2025-10-02 08:28:42.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:28:42 compute-0 nova_compute[192567]: 2025-10-02 08:28:42.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:42 compute-0 nova_compute[192567]: 2025-10-02 08:28:42.702 2 DEBUG nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Check if temp file /var/lib/nova/instances/tmp0jfgpvlp exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Oct 02 08:28:42 compute-0 nova_compute[192567]: 2025-10-02 08:28:42.703 2 DEBUG nova.compute.manager [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0jfgpvlp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09ce154f-a8dc-447f-9b90-c08e3249d3e3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Oct 02 08:28:43 compute-0 nova_compute[192567]: 2025-10-02 08:28:43.436 2 DEBUG oslo_concurrency.processutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:28:43 compute-0 nova_compute[192567]: 2025-10-02 08:28:43.508 2 DEBUG oslo_concurrency.processutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:28:43 compute-0 nova_compute[192567]: 2025-10-02 08:28:43.509 2 DEBUG oslo_concurrency.processutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:28:43 compute-0 nova_compute[192567]: 2025-10-02 08:28:43.568 2 DEBUG oslo_concurrency.processutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:28:45 compute-0 nova_compute[192567]: 2025-10-02 08:28:45.628 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:28:45 compute-0 sshd-session[222830]: Accepted publickey for nova from 192.168.122.101 port 59164 ssh2: ECDSA SHA256:nyj9easCU2+zJyxXdAvgdE/0ePVxCLkFf7X2/rv3WZg
Oct 02 08:28:45 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Oct 02 08:28:45 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct 02 08:28:45 compute-0 systemd-logind[827]: New session 41 of user nova.
Oct 02 08:28:45 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct 02 08:28:45 compute-0 systemd[1]: Starting User Manager for UID 42436...
Oct 02 08:28:45 compute-0 systemd[222834]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:28:45 compute-0 systemd[222834]: Queued start job for default target Main User Target.
Oct 02 08:28:45 compute-0 systemd[222834]: Created slice User Application Slice.
Oct 02 08:28:45 compute-0 systemd[222834]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 02 08:28:45 compute-0 systemd[222834]: Started Daily Cleanup of User's Temporary Directories.
Oct 02 08:28:45 compute-0 systemd[222834]: Reached target Paths.
Oct 02 08:28:45 compute-0 systemd[222834]: Reached target Timers.
Oct 02 08:28:45 compute-0 systemd[222834]: Starting D-Bus User Message Bus Socket...
Oct 02 08:28:45 compute-0 systemd[222834]: Starting Create User's Volatile Files and Directories...
Oct 02 08:28:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:45.988 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:45.989 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:45.990 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:46 compute-0 systemd[222834]: Listening on D-Bus User Message Bus Socket.
Oct 02 08:28:46 compute-0 systemd[222834]: Reached target Sockets.
Oct 02 08:28:46 compute-0 systemd[222834]: Finished Create User's Volatile Files and Directories.
Oct 02 08:28:46 compute-0 systemd[222834]: Reached target Basic System.
Oct 02 08:28:46 compute-0 systemd[222834]: Reached target Main User Target.
Oct 02 08:28:46 compute-0 systemd[222834]: Startup finished in 198ms.
Oct 02 08:28:46 compute-0 systemd[1]: Started User Manager for UID 42436.
Oct 02 08:28:46 compute-0 systemd[1]: Started Session 41 of User nova.
Oct 02 08:28:46 compute-0 sshd-session[222830]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 02 08:28:46 compute-0 sshd-session[222850]: Received disconnect from 192.168.122.101 port 59164:11: disconnected by user
Oct 02 08:28:46 compute-0 sshd-session[222850]: Disconnected from user nova 192.168.122.101 port 59164
Oct 02 08:28:46 compute-0 sshd-session[222830]: pam_unix(sshd:session): session closed for user nova
Oct 02 08:28:46 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Oct 02 08:28:46 compute-0 systemd-logind[827]: Session 41 logged out. Waiting for processes to exit.
Oct 02 08:28:46 compute-0 systemd-logind[827]: Removed session 41.
Oct 02 08:28:46 compute-0 nova_compute[192567]: 2025-10-02 08:28:46.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:28:46 compute-0 nova_compute[192567]: 2025-10-02 08:28:46.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:28:46 compute-0 nova_compute[192567]: 2025-10-02 08:28:46.732 2 DEBUG nova.compute.manager [req-96b942a8-7852-4fb6-8d29-8d0e79b9ddb6 req-af96db3e-b84f-4166-9bae-cb940f3c3208 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-unplugged-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:28:46 compute-0 nova_compute[192567]: 2025-10-02 08:28:46.733 2 DEBUG oslo_concurrency.lockutils [req-96b942a8-7852-4fb6-8d29-8d0e79b9ddb6 req-af96db3e-b84f-4166-9bae-cb940f3c3208 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:46 compute-0 nova_compute[192567]: 2025-10-02 08:28:46.734 2 DEBUG oslo_concurrency.lockutils [req-96b942a8-7852-4fb6-8d29-8d0e79b9ddb6 req-af96db3e-b84f-4166-9bae-cb940f3c3208 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:46 compute-0 nova_compute[192567]: 2025-10-02 08:28:46.734 2 DEBUG oslo_concurrency.lockutils [req-96b942a8-7852-4fb6-8d29-8d0e79b9ddb6 req-af96db3e-b84f-4166-9bae-cb940f3c3208 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:46 compute-0 nova_compute[192567]: 2025-10-02 08:28:46.735 2 DEBUG nova.compute.manager [req-96b942a8-7852-4fb6-8d29-8d0e79b9ddb6 req-af96db3e-b84f-4166-9bae-cb940f3c3208 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] No waiting events found dispatching network-vif-unplugged-45c35630-cf9e-45fc-b081-384e2a1425de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:28:46 compute-0 nova_compute[192567]: 2025-10-02 08:28:46.735 2 DEBUG nova.compute.manager [req-96b942a8-7852-4fb6-8d29-8d0e79b9ddb6 req-af96db3e-b84f-4166-9bae-cb940f3c3208 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-unplugged-45c35630-cf9e-45fc-b081-384e2a1425de for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:28:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:47.326 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:28:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:47.327 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:28:47 compute-0 nova_compute[192567]: 2025-10-02 08:28:47.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:47 compute-0 nova_compute[192567]: 2025-10-02 08:28:47.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:47 compute-0 nova_compute[192567]: 2025-10-02 08:28:47.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:48 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 08:28:48 compute-0 podman[222853]: 2025-10-02 08:28:48.186447835 +0000 UTC m=+0.096971288 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.203 2 INFO nova.compute.manager [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Took 4.63 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.205 2 DEBUG nova.compute.manager [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.220 2 DEBUG nova.compute.manager [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0jfgpvlp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='09ce154f-a8dc-447f-9b90-c08e3249d3e3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(43e0143b-27e0-44e4-a638-c33d49573e91),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.247 2 DEBUG nova.objects.instance [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 09ce154f-a8dc-447f-9b90-c08e3249d3e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.248 2 DEBUG nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.251 2 DEBUG nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.251 2 DEBUG nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.266 2 DEBUG nova.virt.libvirt.vif [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:27:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-384391495',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-384391495',id=20,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:27:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-7nf09cwv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:27:54Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=09ce154f-a8dc-447f-9b90-c08e3249d3e3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.267 2 DEBUG nova.network.os_vif_util [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.268 2 DEBUG nova.network.os_vif_util [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:38:b8,bridge_name='br-int',has_traffic_filtering=True,id=45c35630-cf9e-45fc-b081-384e2a1425de,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45c35630-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.268 2 DEBUG nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Updating guest XML with vif config: <interface type="ethernet">
Oct 02 08:28:48 compute-0 nova_compute[192567]:   <mac address="fa:16:3e:68:38:b8"/>
Oct 02 08:28:48 compute-0 nova_compute[192567]:   <model type="virtio"/>
Oct 02 08:28:48 compute-0 nova_compute[192567]:   <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:28:48 compute-0 nova_compute[192567]:   <mtu size="1442"/>
Oct 02 08:28:48 compute-0 nova_compute[192567]:   <target dev="tap45c35630-cf"/>
Oct 02 08:28:48 compute-0 nova_compute[192567]: </interface>
Oct 02 08:28:48 compute-0 nova_compute[192567]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.269 2 DEBUG nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Oct 02 08:28:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:48.330 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.755 2 DEBUG nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.756 2 INFO nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.861 2 DEBUG nova.compute.manager [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.861 2 DEBUG oslo_concurrency.lockutils [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.861 2 DEBUG oslo_concurrency.lockutils [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.862 2 DEBUG oslo_concurrency.lockutils [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.862 2 DEBUG nova.compute.manager [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] No waiting events found dispatching network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.862 2 WARNING nova.compute.manager [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received unexpected event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de for instance with vm_state active and task_state migrating.
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.862 2 DEBUG nova.compute.manager [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-changed-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.863 2 DEBUG nova.compute.manager [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Refreshing instance network info cache due to event network-changed-45c35630-cf9e-45fc-b081-384e2a1425de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.863 2 DEBUG oslo_concurrency.lockutils [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.863 2 DEBUG oslo_concurrency.lockutils [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.863 2 DEBUG nova.network.neutron [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Refreshing network info cache for port 45c35630-cf9e-45fc-b081-384e2a1425de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:28:48 compute-0 nova_compute[192567]: 2025-10-02 08:28:48.866 2 INFO nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 02 08:28:49 compute-0 nova_compute[192567]: 2025-10-02 08:28:49.369 2 DEBUG nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:28:49 compute-0 nova_compute[192567]: 2025-10-02 08:28:49.369 2 DEBUG nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:28:49 compute-0 nova_compute[192567]: 2025-10-02 08:28:49.874 2 DEBUG nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:28:49 compute-0 nova_compute[192567]: 2025-10-02 08:28:49.874 2 DEBUG nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:28:50 compute-0 nova_compute[192567]: 2025-10-02 08:28:50.219 2 DEBUG nova.network.neutron [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Updated VIF entry in instance network info cache for port 45c35630-cf9e-45fc-b081-384e2a1425de. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:28:50 compute-0 nova_compute[192567]: 2025-10-02 08:28:50.220 2 DEBUG nova.network.neutron [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Updating instance_info_cache with network_info: [{"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:28:50 compute-0 nova_compute[192567]: 2025-10-02 08:28:50.245 2 DEBUG oslo_concurrency.lockutils [req-d87e9132-541d-4327-afc6-796adf8740b3 req-851c8dc9-1477-42b8-b08b-92b94a52c6f8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-09ce154f-a8dc-447f-9b90-c08e3249d3e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:28:50 compute-0 nova_compute[192567]: 2025-10-02 08:28:50.379 2 DEBUG nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:28:50 compute-0 nova_compute[192567]: 2025-10-02 08:28:50.379 2 DEBUG nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:28:50 compute-0 nova_compute[192567]: 2025-10-02 08:28:50.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:28:50 compute-0 nova_compute[192567]: 2025-10-02 08:28:50.883 2 DEBUG nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Oct 02 08:28:50 compute-0 nova_compute[192567]: 2025-10-02 08:28:50.884 2 DEBUG nova.virt.libvirt.migration [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.115 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393731.1145465, 09ce154f-a8dc-447f-9b90-c08e3249d3e3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.115 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] VM Paused (Lifecycle Event)
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.139 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.145 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.170 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] During sync_power_state the instance has a pending task (migrating). Skip.
Oct 02 08:28:51 compute-0 kernel: tap45c35630-cf (unregistering): left promiscuous mode
Oct 02 08:28:51 compute-0 NetworkManager[51654]: <info>  [1759393731.2791] device (tap45c35630-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:51 compute-0 ovn_controller[94821]: 2025-10-02T08:28:51Z|00167|binding|INFO|Releasing lport 45c35630-cf9e-45fc-b081-384e2a1425de from this chassis (sb_readonly=0)
Oct 02 08:28:51 compute-0 ovn_controller[94821]: 2025-10-02T08:28:51Z|00168|binding|INFO|Setting lport 45c35630-cf9e-45fc-b081-384e2a1425de down in Southbound
Oct 02 08:28:51 compute-0 ovn_controller[94821]: 2025-10-02T08:28:51Z|00169|binding|INFO|Removing iface tap45c35630-cf ovn-installed in OVS
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.307 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:38:b8 10.100.0.10'], port_security=['fa:16:3e:68:38:b8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '61f597a0-da80-455c-aab0-956a1e15f143'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '09ce154f-a8dc-447f-9b90-c08e3249d3e3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=45c35630-cf9e-45fc-b081-384e2a1425de) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.310 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 45c35630-cf9e-45fc-b081-384e2a1425de in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.312 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.314 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[cad7fc61-c621-4f14-85f1-ba71d9a3293f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.315 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace which is not needed anymore
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:51 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000014.scope: Deactivated successfully.
Oct 02 08:28:51 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000014.scope: Consumed 14.575s CPU time.
Oct 02 08:28:51 compute-0 systemd-machined[152597]: Machine qemu-15-instance-00000014 terminated.
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:51 compute-0 sshd-session[222878]: Invalid user admin from 103.148.92.60 port 39447
Oct 02 08:28:51 compute-0 sshd-session[222878]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 08:28:51 compute-0 sshd-session[222878]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.148.92.60
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.522 2 DEBUG nova.compute.manager [req-a10685b7-51dd-4b1e-94ac-549989470ede req-c4041dd7-5f78-4d1f-a705-49e8e11e2a44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-unplugged-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.522 2 DEBUG oslo_concurrency.lockutils [req-a10685b7-51dd-4b1e-94ac-549989470ede req-c4041dd7-5f78-4d1f-a705-49e8e11e2a44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.523 2 DEBUG oslo_concurrency.lockutils [req-a10685b7-51dd-4b1e-94ac-549989470ede req-c4041dd7-5f78-4d1f-a705-49e8e11e2a44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.523 2 DEBUG oslo_concurrency.lockutils [req-a10685b7-51dd-4b1e-94ac-549989470ede req-c4041dd7-5f78-4d1f-a705-49e8e11e2a44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.524 2 DEBUG nova.compute.manager [req-a10685b7-51dd-4b1e-94ac-549989470ede req-c4041dd7-5f78-4d1f-a705-49e8e11e2a44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] No waiting events found dispatching network-vif-unplugged-45c35630-cf9e-45fc-b081-384e2a1425de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.524 2 DEBUG nova.compute.manager [req-a10685b7-51dd-4b1e-94ac-549989470ede req-c4041dd7-5f78-4d1f-a705-49e8e11e2a44 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-unplugged-45c35630-cf9e-45fc-b081-384e2a1425de for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:28:51 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[222562]: [NOTICE]   (222566) : haproxy version is 2.8.14-c23fe91
Oct 02 08:28:51 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[222562]: [NOTICE]   (222566) : path to executable is /usr/sbin/haproxy
Oct 02 08:28:51 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[222562]: [ALERT]    (222566) : Current worker (222568) exited with code 143 (Terminated)
Oct 02 08:28:51 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[222562]: [WARNING]  (222566) : All workers exited. Exiting... (0)
Oct 02 08:28:51 compute-0 systemd[1]: libpod-5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74.scope: Deactivated successfully.
Oct 02 08:28:51 compute-0 podman[222920]: 2025-10-02 08:28:51.544583534 +0000 UTC m=+0.087308301 container died 5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.544 2 DEBUG nova.virt.libvirt.guest [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.544 2 INFO nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Migration operation has completed
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.545 2 INFO nova.compute.manager [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] _post_live_migration() is started..
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.553 2 DEBUG nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.553 2 DEBUG nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.554 2 DEBUG nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Oct 02 08:28:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74-userdata-shm.mount: Deactivated successfully.
Oct 02 08:28:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-0fa7b65d71484909306df798ff586dc4ca7bd5218f2fa75183f16ffffde3e745-merged.mount: Deactivated successfully.
Oct 02 08:28:51 compute-0 podman[222920]: 2025-10-02 08:28:51.601780475 +0000 UTC m=+0.144505292 container cleanup 5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:28:51 compute-0 systemd[1]: libpod-conmon-5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74.scope: Deactivated successfully.
Oct 02 08:28:51 compute-0 podman[222965]: 2025-10-02 08:28:51.705068976 +0000 UTC m=+0.066272362 container remove 5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.714 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[15371cfc-cfea-44f8-98ec-735c563ce125]: (4, ('Thu Oct  2 08:28:51 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74)\n5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74\nThu Oct  2 08:28:51 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74)\n5acb15ca62529e927e6cbf98ce8547e7dcbae66e743b468978c0e75b46a38c74\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.717 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bc1ca0-6690-4310-95e0-c1889014829d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.719 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:51 compute-0 kernel: tap08b16a0c-b0: left promiscuous mode
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:51 compute-0 nova_compute[192567]: 2025-10-02 08:28:51.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.763 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4902e2d7-b9ca-4a3f-a965-46c68b99d3e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.794 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5390e157-389a-4469-995c-688605c1694a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.796 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[88fab7bf-01db-4986-9fa4-a469cab63c7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.825 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[7036f137-eda9-4a8f-9c0c-bca97a6db72d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456072, 'reachable_time': 21220, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222981, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.828 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:28:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:28:51.828 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[4d6c1e6d-3973-43f7-abd2-d090330de253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:28:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d08b16a0c\x2db69f\x2d4a34\x2d9bfe\x2d830099adfe8d.mount: Deactivated successfully.
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.080 2 DEBUG nova.compute.manager [req-b499309f-d279-451a-ae9f-6e9231654422 req-0a9ccb75-28d4-4580-b613-16c87f67cf92 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-unplugged-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.081 2 DEBUG oslo_concurrency.lockutils [req-b499309f-d279-451a-ae9f-6e9231654422 req-0a9ccb75-28d4-4580-b613-16c87f67cf92 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.082 2 DEBUG oslo_concurrency.lockutils [req-b499309f-d279-451a-ae9f-6e9231654422 req-0a9ccb75-28d4-4580-b613-16c87f67cf92 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.083 2 DEBUG oslo_concurrency.lockutils [req-b499309f-d279-451a-ae9f-6e9231654422 req-0a9ccb75-28d4-4580-b613-16c87f67cf92 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.083 2 DEBUG nova.compute.manager [req-b499309f-d279-451a-ae9f-6e9231654422 req-0a9ccb75-28d4-4580-b613-16c87f67cf92 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] No waiting events found dispatching network-vif-unplugged-45c35630-cf9e-45fc-b081-384e2a1425de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.084 2 DEBUG nova.compute.manager [req-b499309f-d279-451a-ae9f-6e9231654422 req-0a9ccb75-28d4-4580-b613-16c87f67cf92 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-unplugged-45c35630-cf9e-45fc-b081-384e2a1425de for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.336 2 DEBUG nova.network.neutron [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Activated binding for port 45c35630-cf9e-45fc-b081-384e2a1425de and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.337 2 DEBUG nova.compute.manager [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.339 2 DEBUG nova.virt.libvirt.vif [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:27:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-384391495',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-384391495',id=20,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:27:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-7nf09cwv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:28:40Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=09ce154f-a8dc-447f-9b90-c08e3249d3e3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.340 2 DEBUG nova.network.os_vif_util [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "45c35630-cf9e-45fc-b081-384e2a1425de", "address": "fa:16:3e:68:38:b8", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45c35630-cf", "ovs_interfaceid": "45c35630-cf9e-45fc-b081-384e2a1425de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.341 2 DEBUG nova.network.os_vif_util [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:38:b8,bridge_name='br-int',has_traffic_filtering=True,id=45c35630-cf9e-45fc-b081-384e2a1425de,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45c35630-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.341 2 DEBUG os_vif [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:38:b8,bridge_name='br-int',has_traffic_filtering=True,id=45c35630-cf9e-45fc-b081-384e2a1425de,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45c35630-cf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.344 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45c35630-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.352 2 INFO os_vif [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:38:b8,bridge_name='br-int',has_traffic_filtering=True,id=45c35630-cf9e-45fc-b081-384e2a1425de,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45c35630-cf')
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.353 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.353 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.354 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.354 2 DEBUG nova.compute.manager [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.355 2 INFO nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Deleting instance files /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3_del
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.356 2 INFO nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Deletion of /var/lib/nova/instances/09ce154f-a8dc-447f-9b90-c08e3249d3e3_del complete
Oct 02 08:28:52 compute-0 nova_compute[192567]: 2025-10-02 08:28:52.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.635 2 DEBUG nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.635 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.636 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.637 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.637 2 DEBUG nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] No waiting events found dispatching network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.638 2 WARNING nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received unexpected event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de for instance with vm_state active and task_state migrating.
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.638 2 DEBUG nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.639 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.639 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.639 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.640 2 DEBUG nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] No waiting events found dispatching network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.640 2 WARNING nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received unexpected event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de for instance with vm_state active and task_state migrating.
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.641 2 DEBUG nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.641 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.642 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.642 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.642 2 DEBUG nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] No waiting events found dispatching network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.643 2 WARNING nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received unexpected event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de for instance with vm_state active and task_state migrating.
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.643 2 DEBUG nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.644 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.644 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.645 2 DEBUG oslo_concurrency.lockutils [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.645 2 DEBUG nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] No waiting events found dispatching network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:28:53 compute-0 nova_compute[192567]: 2025-10-02 08:28:53.646 2 WARNING nova.compute.manager [req-960048c0-e329-4959-b0d4-8898e29871aa req-46c4ddab-38be-4601-baee-bea1eff39975 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Received unexpected event network-vif-plugged-45c35630-cf9e-45fc-b081-384e2a1425de for instance with vm_state active and task_state migrating.
Oct 02 08:28:53 compute-0 sshd-session[222878]: Failed password for invalid user admin from 103.148.92.60 port 39447 ssh2
Oct 02 08:28:54 compute-0 sshd-session[222878]: Connection closed by invalid user admin 103.148.92.60 port 39447 [preauth]
Oct 02 08:28:56 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Oct 02 08:28:56 compute-0 systemd[222834]: Activating special unit Exit the Session...
Oct 02 08:28:56 compute-0 systemd[222834]: Stopped target Main User Target.
Oct 02 08:28:56 compute-0 systemd[222834]: Stopped target Basic System.
Oct 02 08:28:56 compute-0 systemd[222834]: Stopped target Paths.
Oct 02 08:28:56 compute-0 systemd[222834]: Stopped target Sockets.
Oct 02 08:28:56 compute-0 systemd[222834]: Stopped target Timers.
Oct 02 08:28:56 compute-0 systemd[222834]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 02 08:28:56 compute-0 systemd[222834]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 02 08:28:56 compute-0 systemd[222834]: Closed D-Bus User Message Bus Socket.
Oct 02 08:28:56 compute-0 systemd[222834]: Stopped Create User's Volatile Files and Directories.
Oct 02 08:28:56 compute-0 systemd[222834]: Removed slice User Application Slice.
Oct 02 08:28:56 compute-0 systemd[222834]: Reached target Shutdown.
Oct 02 08:28:56 compute-0 systemd[222834]: Finished Exit the Session.
Oct 02 08:28:56 compute-0 systemd[222834]: Reached target Exit the Session.
Oct 02 08:28:56 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Oct 02 08:28:56 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Oct 02 08:28:56 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct 02 08:28:56 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct 02 08:28:56 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct 02 08:28:56 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct 02 08:28:56 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Oct 02 08:28:57 compute-0 nova_compute[192567]: 2025-10-02 08:28:57.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:57 compute-0 nova_compute[192567]: 2025-10-02 08:28:57.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:28:59 compute-0 podman[203011]: time="2025-10-02T08:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:28:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:28:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.260 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.260 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.260 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "09ce154f-a8dc-447f-9b90-c08e3249d3e3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.292 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.292 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.292 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.293 2 DEBUG nova.compute.resource_tracker [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:29:00 compute-0 podman[222984]: 2025-10-02 08:29:00.452337926 +0000 UTC m=+0.102939601 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7)
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.500 2 WARNING nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.501 2 DEBUG nova.compute.resource_tracker [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5857MB free_disk=73.46520614624023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.501 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.502 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.551 2 DEBUG nova.compute.resource_tracker [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Migration for instance 09ce154f-a8dc-447f-9b90-c08e3249d3e3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.589 2 DEBUG nova.compute.resource_tracker [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.635 2 DEBUG nova.compute.resource_tracker [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Migration 43e0143b-27e0-44e4-a638-c33d49573e91 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.637 2 DEBUG nova.compute.resource_tracker [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.637 2 DEBUG nova.compute.resource_tracker [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.679 2 DEBUG nova.compute.provider_tree [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.693 2 DEBUG nova.scheduler.client.report [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.716 2 DEBUG nova.compute.resource_tracker [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.716 2 DEBUG oslo_concurrency.lockutils [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.721 2 INFO nova.compute.manager [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.800 2 INFO nova.scheduler.client.report [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Deleted allocation for migration 43e0143b-27e0-44e4-a638-c33d49573e91
Oct 02 08:29:00 compute-0 nova_compute[192567]: 2025-10-02 08:29:00.800 2 DEBUG nova.virt.libvirt.driver [None req-04042fbe-3d0d-4ee7-ae7b-d456dfce691d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Oct 02 08:29:01 compute-0 openstack_network_exporter[205118]: ERROR   08:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:29:01 compute-0 openstack_network_exporter[205118]: ERROR   08:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:29:01 compute-0 openstack_network_exporter[205118]: ERROR   08:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:29:01 compute-0 openstack_network_exporter[205118]: ERROR   08:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:29:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:29:01 compute-0 openstack_network_exporter[205118]: ERROR   08:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:29:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:29:02 compute-0 nova_compute[192567]: 2025-10-02 08:29:02.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:02 compute-0 nova_compute[192567]: 2025-10-02 08:29:02.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:06 compute-0 nova_compute[192567]: 2025-10-02 08:29:06.553 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393731.5437202, 09ce154f-a8dc-447f-9b90-c08e3249d3e3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:29:06 compute-0 nova_compute[192567]: 2025-10-02 08:29:06.553 2 INFO nova.compute.manager [-] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] VM Stopped (Lifecycle Event)
Oct 02 08:29:06 compute-0 nova_compute[192567]: 2025-10-02 08:29:06.583 2 DEBUG nova.compute.manager [None req-b35582aa-ab24-4c3d-906f-69dbed967bf4 - - - - - -] [instance: 09ce154f-a8dc-447f-9b90-c08e3249d3e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:29:07 compute-0 nova_compute[192567]: 2025-10-02 08:29:07.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:07 compute-0 nova_compute[192567]: 2025-10-02 08:29:07.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:12 compute-0 nova_compute[192567]: 2025-10-02 08:29:12.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:12 compute-0 nova_compute[192567]: 2025-10-02 08:29:12.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:13 compute-0 podman[223005]: 2025-10-02 08:29:13.197468209 +0000 UTC m=+0.100197638 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 02 08:29:13 compute-0 podman[223007]: 2025-10-02 08:29:13.217201087 +0000 UTC m=+0.109499564 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct 02 08:29:13 compute-0 podman[223008]: 2025-10-02 08:29:13.237792341 +0000 UTC m=+0.125010362 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Oct 02 08:29:13 compute-0 podman[223006]: 2025-10-02 08:29:13.259640244 +0000 UTC m=+0.158634367 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:29:17 compute-0 nova_compute[192567]: 2025-10-02 08:29:17.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:17 compute-0 nova_compute[192567]: 2025-10-02 08:29:17.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:19 compute-0 podman[223083]: 2025-10-02 08:29:19.182499014 +0000 UTC m=+0.091589991 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:29:22 compute-0 nova_compute[192567]: 2025-10-02 08:29:22.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:22 compute-0 nova_compute[192567]: 2025-10-02 08:29:22.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:26 compute-0 nova_compute[192567]: 2025-10-02 08:29:26.969 2 DEBUG nova.compute.manager [None req-10d1eb73-e840-4e67-b136-6e98466f82cd 06fd0ba32e344f06ac22f27398df6fab a46cbd7217a541c58391886cae342f44 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Oct 02 08:29:27 compute-0 nova_compute[192567]: 2025-10-02 08:29:27.032 2 DEBUG nova.compute.provider_tree [None req-10d1eb73-e840-4e67-b136-6e98466f82cd 06fd0ba32e344f06ac22f27398df6fab a46cbd7217a541c58391886cae342f44 - - default default] Updating resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e generation from 31 to 34 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 02 08:29:27 compute-0 nova_compute[192567]: 2025-10-02 08:29:27.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:27 compute-0 nova_compute[192567]: 2025-10-02 08:29:27.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:29 compute-0 podman[203011]: time="2025-10-02T08:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:29:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:29:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 02 08:29:31 compute-0 podman[223109]: 2025-10-02 08:29:31.186708036 +0000 UTC m=+0.094935446 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Oct 02 08:29:31 compute-0 openstack_network_exporter[205118]: ERROR   08:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:29:31 compute-0 openstack_network_exporter[205118]: ERROR   08:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:29:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:29:31 compute-0 openstack_network_exporter[205118]: ERROR   08:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:29:31 compute-0 openstack_network_exporter[205118]: ERROR   08:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:29:31 compute-0 openstack_network_exporter[205118]: ERROR   08:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:29:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:29:32 compute-0 nova_compute[192567]: 2025-10-02 08:29:32.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:32 compute-0 nova_compute[192567]: 2025-10-02 08:29:32.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:37 compute-0 nova_compute[192567]: 2025-10-02 08:29:37.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:37 compute-0 nova_compute[192567]: 2025-10-02 08:29:37.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:39 compute-0 nova_compute[192567]: 2025-10-02 08:29:39.622 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:29:39 compute-0 nova_compute[192567]: 2025-10-02 08:29:39.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:29:40 compute-0 nova_compute[192567]: 2025-10-02 08:29:40.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:29:40 compute-0 nova_compute[192567]: 2025-10-02 08:29:40.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:29:40 compute-0 nova_compute[192567]: 2025-10-02 08:29:40.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:29:40 compute-0 nova_compute[192567]: 2025-10-02 08:29:40.645 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.662 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.663 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.663 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.664 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.872 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.874 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5865MB free_disk=73.46522521972656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.874 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.874 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.956 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.956 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:29:41 compute-0 nova_compute[192567]: 2025-10-02 08:29:41.995 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:29:42 compute-0 nova_compute[192567]: 2025-10-02 08:29:42.019 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:29:42 compute-0 nova_compute[192567]: 2025-10-02 08:29:42.021 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:29:42 compute-0 nova_compute[192567]: 2025-10-02 08:29:42.021 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:29:42 compute-0 nova_compute[192567]: 2025-10-02 08:29:42.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:42 compute-0 nova_compute[192567]: 2025-10-02 08:29:42.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:44 compute-0 nova_compute[192567]: 2025-10-02 08:29:44.021 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:29:44 compute-0 podman[223131]: 2025-10-02 08:29:44.168317901 +0000 UTC m=+0.082957806 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 02 08:29:44 compute-0 podman[223134]: 2025-10-02 08:29:44.187185332 +0000 UTC m=+0.080271213 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3)
Oct 02 08:29:44 compute-0 podman[223133]: 2025-10-02 08:29:44.193709283 +0000 UTC m=+0.090987643 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct 02 08:29:44 compute-0 podman[223132]: 2025-10-02 08:29:44.220415885 +0000 UTC m=+0.130165390 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:29:45 compute-0 nova_compute[192567]: 2025-10-02 08:29:45.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:29:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:29:45.989 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:29:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:29:45.989 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:29:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:29:45.989 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:29:47 compute-0 nova_compute[192567]: 2025-10-02 08:29:47.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:47 compute-0 nova_compute[192567]: 2025-10-02 08:29:47.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:29:47 compute-0 nova_compute[192567]: 2025-10-02 08:29:47.623 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:29:47 compute-0 nova_compute[192567]: 2025-10-02 08:29:47.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:48 compute-0 nova_compute[192567]: 2025-10-02 08:29:48.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:29:50 compute-0 podman[223214]: 2025-10-02 08:29:50.174669053 +0000 UTC m=+0.081851354 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:29:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:29:51.761 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:29:51 compute-0 nova_compute[192567]: 2025-10-02 08:29:51.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:29:51.764 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:29:52 compute-0 nova_compute[192567]: 2025-10-02 08:29:52.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:52 compute-0 nova_compute[192567]: 2025-10-02 08:29:52.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:29:52 compute-0 nova_compute[192567]: 2025-10-02 08:29:52.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:57 compute-0 nova_compute[192567]: 2025-10-02 08:29:57.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:57 compute-0 nova_compute[192567]: 2025-10-02 08:29:57.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:29:59 compute-0 podman[203011]: time="2025-10-02T08:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:29:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:29:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Oct 02 08:30:00 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:30:00.765 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:30:01 compute-0 openstack_network_exporter[205118]: ERROR   08:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:30:01 compute-0 openstack_network_exporter[205118]: ERROR   08:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:30:01 compute-0 openstack_network_exporter[205118]: ERROR   08:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:30:01 compute-0 openstack_network_exporter[205118]: ERROR   08:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:30:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:30:01 compute-0 openstack_network_exporter[205118]: ERROR   08:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:30:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:30:02 compute-0 podman[223238]: 2025-10-02 08:30:02.171721665 +0000 UTC m=+0.083983219 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Oct 02 08:30:02 compute-0 nova_compute[192567]: 2025-10-02 08:30:02.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:02 compute-0 nova_compute[192567]: 2025-10-02 08:30:02.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:07 compute-0 nova_compute[192567]: 2025-10-02 08:30:07.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:07 compute-0 nova_compute[192567]: 2025-10-02 08:30:07.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:12 compute-0 nova_compute[192567]: 2025-10-02 08:30:12.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:12 compute-0 nova_compute[192567]: 2025-10-02 08:30:12.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:15 compute-0 podman[223259]: 2025-10-02 08:30:15.193449336 +0000 UTC m=+0.094817706 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct 02 08:30:15 compute-0 podman[223261]: 2025-10-02 08:30:15.218405324 +0000 UTC m=+0.107167621 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd)
Oct 02 08:30:15 compute-0 podman[223262]: 2025-10-02 08:30:15.229980165 +0000 UTC m=+0.113747927 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 02 08:30:15 compute-0 podman[223260]: 2025-10-02 08:30:15.241577746 +0000 UTC m=+0.135701811 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Oct 02 08:30:17 compute-0 nova_compute[192567]: 2025-10-02 08:30:17.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:17 compute-0 nova_compute[192567]: 2025-10-02 08:30:17.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:21 compute-0 podman[223343]: 2025-10-02 08:30:21.172050247 +0000 UTC m=+0.080338126 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:30:21 compute-0 ovn_controller[94821]: 2025-10-02T08:30:21Z|00170|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct 02 08:30:22 compute-0 nova_compute[192567]: 2025-10-02 08:30:22.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:22 compute-0 nova_compute[192567]: 2025-10-02 08:30:22.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:27 compute-0 nova_compute[192567]: 2025-10-02 08:30:27.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:27 compute-0 nova_compute[192567]: 2025-10-02 08:30:27.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:29 compute-0 podman[203011]: time="2025-10-02T08:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:30:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:30:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Oct 02 08:30:31 compute-0 openstack_network_exporter[205118]: ERROR   08:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:30:31 compute-0 openstack_network_exporter[205118]: ERROR   08:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:30:31 compute-0 openstack_network_exporter[205118]: ERROR   08:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:30:31 compute-0 openstack_network_exporter[205118]: ERROR   08:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:30:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:30:31 compute-0 openstack_network_exporter[205118]: ERROR   08:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:30:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:30:32 compute-0 nova_compute[192567]: 2025-10-02 08:30:32.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:32 compute-0 nova_compute[192567]: 2025-10-02 08:30:32.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:33 compute-0 podman[223368]: 2025-10-02 08:30:33.150162544 +0000 UTC m=+0.060896490 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 02 08:30:37 compute-0 nova_compute[192567]: 2025-10-02 08:30:37.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:37 compute-0 nova_compute[192567]: 2025-10-02 08:30:37.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:39 compute-0 nova_compute[192567]: 2025-10-02 08:30:39.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:30:41 compute-0 nova_compute[192567]: 2025-10-02 08:30:41.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:30:41 compute-0 nova_compute[192567]: 2025-10-02 08:30:41.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:30:42 compute-0 nova_compute[192567]: 2025-10-02 08:30:42.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:30:42 compute-0 nova_compute[192567]: 2025-10-02 08:30:42.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:30:42 compute-0 nova_compute[192567]: 2025-10-02 08:30:42.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:30:42 compute-0 nova_compute[192567]: 2025-10-02 08:30:42.636 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:30:42 compute-0 nova_compute[192567]: 2025-10-02 08:30:42.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:30:42 compute-0 nova_compute[192567]: 2025-10-02 08:30:42.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:30:42 compute-0 nova_compute[192567]: 2025-10-02 08:30:42.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 02 08:30:42 compute-0 nova_compute[192567]: 2025-10-02 08:30:42.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:30:42 compute-0 nova_compute[192567]: 2025-10-02 08:30:42.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:42 compute-0 nova_compute[192567]: 2025-10-02 08:30:42.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.648 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.649 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.649 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.649 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.837 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.838 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5882MB free_disk=73.4652099609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.838 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.838 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.915 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.916 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.942 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.966 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.966 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:30:43 compute-0 nova_compute[192567]: 2025-10-02 08:30:43.984 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:30:44 compute-0 nova_compute[192567]: 2025-10-02 08:30:44.005 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:30:44 compute-0 nova_compute[192567]: 2025-10-02 08:30:44.041 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:30:44 compute-0 nova_compute[192567]: 2025-10-02 08:30:44.056 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:30:44 compute-0 nova_compute[192567]: 2025-10-02 08:30:44.059 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:30:44 compute-0 nova_compute[192567]: 2025-10-02 08:30:44.059 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:30:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:30:45.989 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:30:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:30:45.990 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:30:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:30:45.990 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:30:46 compute-0 podman[223389]: 2025-10-02 08:30:46.174517438 +0000 UTC m=+0.084345640 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:30:46 compute-0 podman[223394]: 2025-10-02 08:30:46.206770533 +0000 UTC m=+0.095355873 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:30:46 compute-0 podman[223397]: 2025-10-02 08:30:46.207263288 +0000 UTC m=+0.093879706 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:30:46 compute-0 podman[223390]: 2025-10-02 08:30:46.222445072 +0000 UTC m=+0.116027708 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 02 08:30:47 compute-0 nova_compute[192567]: 2025-10-02 08:30:47.060 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:30:47 compute-0 nova_compute[192567]: 2025-10-02 08:30:47.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:30:47 compute-0 nova_compute[192567]: 2025-10-02 08:30:47.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:30:47 compute-0 nova_compute[192567]: 2025-10-02 08:30:47.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:30:47 compute-0 nova_compute[192567]: 2025-10-02 08:30:47.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:30:47 compute-0 nova_compute[192567]: 2025-10-02 08:30:47.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 02 08:30:47 compute-0 nova_compute[192567]: 2025-10-02 08:30:47.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:30:47 compute-0 nova_compute[192567]: 2025-10-02 08:30:47.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:47 compute-0 nova_compute[192567]: 2025-10-02 08:30:47.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:30:50 compute-0 nova_compute[192567]: 2025-10-02 08:30:50.739 2 DEBUG nova.virt.libvirt.driver [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Creating tmpfile /var/lib/nova/instances/tmp6jrbwaun to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:30:50 compute-0 nova_compute[192567]: 2025-10-02 08:30:50.741 2 DEBUG nova.compute.manager [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6jrbwaun',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:30:50 compute-0 nova_compute[192567]: 2025-10-02 08:30:50.751 2 DEBUG nova.virt.libvirt.driver [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Creating tmpfile /var/lib/nova/instances/tmpds9wrrjs to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:30:50 compute-0 nova_compute[192567]: 2025-10-02 08:30:50.752 2 DEBUG nova.compute.manager [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpds9wrrjs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:30:51 compute-0 nova_compute[192567]: 2025-10-02 08:30:51.929 2 DEBUG nova.compute.manager [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6jrbwaun',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='730536ae-5a6b-4165-b40b-412e4afc0180',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:30:51 compute-0 nova_compute[192567]: 2025-10-02 08:30:51.972 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-730536ae-5a6b-4165-b40b-412e4afc0180" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:30:51 compute-0 nova_compute[192567]: 2025-10-02 08:30:51.973 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-730536ae-5a6b-4165-b40b-412e4afc0180" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:30:51 compute-0 nova_compute[192567]: 2025-10-02 08:30:51.973 2 DEBUG nova.network.neutron [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:30:52 compute-0 podman[223471]: 2025-10-02 08:30:52.196703169 +0000 UTC m=+0.095121866 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.952 2 DEBUG nova.network.neutron [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Updating instance_info_cache with network_info: [{"id": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "address": "fa:16:3e:0b:87:71", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d466f0d-d5", "ovs_interfaceid": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.970 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-730536ae-5a6b-4165-b40b-412e4afc0180" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.973 2 DEBUG nova.virt.libvirt.driver [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6jrbwaun',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='730536ae-5a6b-4165-b40b-412e4afc0180',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.974 2 DEBUG nova.virt.libvirt.driver [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Creating instance directory: /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.974 2 DEBUG nova.virt.libvirt.driver [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Creating disk.info with the contents: {'/var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk': 'qcow2', '/var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.975 2 DEBUG nova.virt.libvirt.driver [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:30:52 compute-0 nova_compute[192567]: 2025-10-02 08:30:52.976 2 DEBUG nova.objects.instance [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 730536ae-5a6b-4165-b40b-412e4afc0180 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.024 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.113 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.114 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.115 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.129 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.208 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.210 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.265 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.267 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.268 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.359 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.361 2 DEBUG nova.virt.disk.api [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.361 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.454 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.456 2 DEBUG nova.virt.disk.api [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.457 2 DEBUG nova.objects.instance [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 730536ae-5a6b-4165-b40b-412e4afc0180 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.485 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.526 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk.config 485376" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.528 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk.config to /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.529 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk.config /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:30:53 compute-0 nova_compute[192567]: 2025-10-02 08:30:53.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.134 2 DEBUG oslo_concurrency.processutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180/disk.config /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180" returned: 0 in 0.605s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.136 2 DEBUG nova.virt.libvirt.driver [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.138 2 DEBUG nova.virt.libvirt.vif [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:29:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-269880737',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-269880737',id=21,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:29:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-rp5mkqay',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:29:41Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=730536ae-5a6b-4165-b40b-412e4afc0180,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "address": "fa:16:3e:0b:87:71", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2d466f0d-d5", "ovs_interfaceid": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.138 2 DEBUG nova.network.os_vif_util [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "address": "fa:16:3e:0b:87:71", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2d466f0d-d5", "ovs_interfaceid": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.140 2 DEBUG nova.network.os_vif_util [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0b:87:71,bridge_name='br-int',has_traffic_filtering=True,id=2d466f0d-d549-40f3-8d0a-a971202ed70d,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d466f0d-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.141 2 DEBUG os_vif [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:87:71,bridge_name='br-int',has_traffic_filtering=True,id=2d466f0d-d549-40f3-8d0a-a971202ed70d,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d466f0d-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.143 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.144 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.150 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2d466f0d-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.151 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2d466f0d-d5, col_values=(('external_ids', {'iface-id': '2d466f0d-d549-40f3-8d0a-a971202ed70d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0b:87:71', 'vm-uuid': '730536ae-5a6b-4165-b40b-412e4afc0180'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:54 compute-0 NetworkManager[51654]: <info>  [1759393854.1550] manager: (tap2d466f0d-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.163 2 INFO os_vif [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:87:71,bridge_name='br-int',has_traffic_filtering=True,id=2d466f0d-d549-40f3-8d0a-a971202ed70d,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d466f0d-d5')
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.164 2 DEBUG nova.virt.libvirt.driver [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:30:54 compute-0 nova_compute[192567]: 2025-10-02 08:30:54.165 2 DEBUG nova.compute.manager [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6jrbwaun',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='730536ae-5a6b-4165-b40b-412e4afc0180',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:30:57 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:30:57.346 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:30:57 compute-0 nova_compute[192567]: 2025-10-02 08:30:57.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:57 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:30:57.348 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:30:57 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:30:57.349 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:30:57 compute-0 nova_compute[192567]: 2025-10-02 08:30:57.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:58 compute-0 nova_compute[192567]: 2025-10-02 08:30:58.295 2 DEBUG nova.network.neutron [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Port 2d466f0d-d549-40f3-8d0a-a971202ed70d updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:30:58 compute-0 nova_compute[192567]: 2025-10-02 08:30:58.299 2 DEBUG nova.compute.manager [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp6jrbwaun',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='730536ae-5a6b-4165-b40b-412e4afc0180',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:30:58 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 08:30:58 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 08:30:58 compute-0 kernel: tap2d466f0d-d5: entered promiscuous mode
Oct 02 08:30:58 compute-0 ovn_controller[94821]: 2025-10-02T08:30:58Z|00171|binding|INFO|Claiming lport 2d466f0d-d549-40f3-8d0a-a971202ed70d for this additional chassis.
Oct 02 08:30:58 compute-0 ovn_controller[94821]: 2025-10-02T08:30:58Z|00172|binding|INFO|2d466f0d-d549-40f3-8d0a-a971202ed70d: Claiming fa:16:3e:0b:87:71 10.100.0.9
Oct 02 08:30:58 compute-0 nova_compute[192567]: 2025-10-02 08:30:58.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:58 compute-0 NetworkManager[51654]: <info>  [1759393858.6600] manager: (tap2d466f0d-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Oct 02 08:30:58 compute-0 ovn_controller[94821]: 2025-10-02T08:30:58Z|00173|binding|INFO|Setting lport 2d466f0d-d549-40f3-8d0a-a971202ed70d ovn-installed in OVS
Oct 02 08:30:58 compute-0 nova_compute[192567]: 2025-10-02 08:30:58.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:58 compute-0 nova_compute[192567]: 2025-10-02 08:30:58.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:58 compute-0 systemd-machined[152597]: New machine qemu-16-instance-00000015.
Oct 02 08:30:58 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000015.
Oct 02 08:30:58 compute-0 systemd-udevd[223550]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:30:58 compute-0 NetworkManager[51654]: <info>  [1759393858.7523] device (tap2d466f0d-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:30:58 compute-0 NetworkManager[51654]: <info>  [1759393858.7534] device (tap2d466f0d-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:30:59 compute-0 nova_compute[192567]: 2025-10-02 08:30:59.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:30:59 compute-0 podman[203011]: time="2025-10-02T08:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:30:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:30:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 02 08:30:59 compute-0 nova_compute[192567]: 2025-10-02 08:30:59.842 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393859.8418803, 730536ae-5a6b-4165-b40b-412e4afc0180 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:30:59 compute-0 nova_compute[192567]: 2025-10-02 08:30:59.843 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] VM Started (Lifecycle Event)
Oct 02 08:30:59 compute-0 nova_compute[192567]: 2025-10-02 08:30:59.862 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:31:00 compute-0 nova_compute[192567]: 2025-10-02 08:31:00.693 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393860.6935525, 730536ae-5a6b-4165-b40b-412e4afc0180 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:31:00 compute-0 nova_compute[192567]: 2025-10-02 08:31:00.694 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] VM Resumed (Lifecycle Event)
Oct 02 08:31:00 compute-0 nova_compute[192567]: 2025-10-02 08:31:00.715 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:31:00 compute-0 nova_compute[192567]: 2025-10-02 08:31:00.718 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:31:00 compute-0 nova_compute[192567]: 2025-10-02 08:31:00.741 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:31:01 compute-0 openstack_network_exporter[205118]: ERROR   08:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:31:01 compute-0 openstack_network_exporter[205118]: ERROR   08:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:31:01 compute-0 openstack_network_exporter[205118]: ERROR   08:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:31:01 compute-0 openstack_network_exporter[205118]: ERROR   08:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:31:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:31:01 compute-0 openstack_network_exporter[205118]: ERROR   08:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:31:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:31:01 compute-0 ovn_controller[94821]: 2025-10-02T08:31:01Z|00174|binding|INFO|Claiming lport 2d466f0d-d549-40f3-8d0a-a971202ed70d for this chassis.
Oct 02 08:31:01 compute-0 ovn_controller[94821]: 2025-10-02T08:31:01Z|00175|binding|INFO|2d466f0d-d549-40f3-8d0a-a971202ed70d: Claiming fa:16:3e:0b:87:71 10.100.0.9
Oct 02 08:31:01 compute-0 ovn_controller[94821]: 2025-10-02T08:31:01Z|00176|binding|INFO|Setting lport 2d466f0d-d549-40f3-8d0a-a971202ed70d up in Southbound
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.584 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:87:71 10.100.0.9'], port_security=['fa:16:3e:0b:87:71 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '730536ae-5a6b-4165-b40b-412e4afc0180', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=2d466f0d-d549-40f3-8d0a-a971202ed70d) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.585 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 2d466f0d-d549-40f3-8d0a-a971202ed70d in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d bound to our chassis
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.586 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.606 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ee7790-124e-4cb3-9d4d-70d19b2503a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.607 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08b16a0c-b1 in ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.611 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08b16a0c-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.612 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4c149749-7e1d-48d7-aafd-1b680b442344]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.613 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[acce9de2-3098-499f-8484-43d6185eb394]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.628 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[7895f400-fc70-494b-97a7-b0a3f92477fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.663 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[210065f3-3aea-46f9-b76d-540cc13af2fb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.714 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[c70cdbe8-50d2-4170-8f74-29c70cc0aad1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.722 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[34f8a5a1-7f8c-4c21-a591-23a0d26ebc3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 NetworkManager[51654]: <info>  [1759393861.7235] manager: (tap08b16a0c-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Oct 02 08:31:01 compute-0 nova_compute[192567]: 2025-10-02 08:31:01.768 2 INFO nova.compute.manager [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Post operation of migration started
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.776 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[e559626c-c541-417a-ad77-c7f7d77cc290]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 systemd-udevd[223588]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.782 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[c63ccbe4-a9c1-4a5c-a38d-b0d9df49fa91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 NetworkManager[51654]: <info>  [1759393861.8293] device (tap08b16a0c-b0): carrier: link connected
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.834 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[10889e89-df62-4938-9922-5f169c5a3b32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.857 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[3b687f22-076a-4e59-9d1c-17c8cd7f5994]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474941, 'reachable_time': 41515, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223607, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.876 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[d68e767f-0ee2-4650-8e83-47b64d132397]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:c53f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474941, 'tstamp': 474941}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223608, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.898 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc7ef1a-4df7-47f2-9cd9-5f425027151b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474941, 'reachable_time': 41515, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223609, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:01.949 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[98a03bd7-24e8-4284-9681-b33fabb22899]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:02 compute-0 nova_compute[192567]: 2025-10-02 08:31:02.041 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-730536ae-5a6b-4165-b40b-412e4afc0180" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:31:02 compute-0 nova_compute[192567]: 2025-10-02 08:31:02.041 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-730536ae-5a6b-4165-b40b-412e4afc0180" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:31:02 compute-0 nova_compute[192567]: 2025-10-02 08:31:02.042 2 DEBUG nova.network.neutron [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:02.056 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[25175f16-ea2d-43ab-b79e-efdc382b720a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:02.058 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:02.059 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:02.059 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:02 compute-0 nova_compute[192567]: 2025-10-02 08:31:02.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:02 compute-0 NetworkManager[51654]: <info>  [1759393862.0629] manager: (tap08b16a0c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Oct 02 08:31:02 compute-0 kernel: tap08b16a0c-b0: entered promiscuous mode
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:02.069 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:02 compute-0 nova_compute[192567]: 2025-10-02 08:31:02.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:02 compute-0 ovn_controller[94821]: 2025-10-02T08:31:02Z|00177|binding|INFO|Releasing lport 748eef31-77a8-4b04-b6b7-dc0f7cc1cf65 from this chassis (sb_readonly=0)
Oct 02 08:31:02 compute-0 nova_compute[192567]: 2025-10-02 08:31:02.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:02.073 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:02.074 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[41e68258-68ae-4f89-b9af-92e92d924a81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:02.075 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:31:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:02.076 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'env', 'PROCESS_TAG=haproxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08b16a0c-b69f-4a34-9bfe-830099adfe8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:31:02 compute-0 nova_compute[192567]: 2025-10-02 08:31:02.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:02 compute-0 podman[223641]: 2025-10-02 08:31:02.519722188 +0000 UTC m=+0.062310804 container create 3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 02 08:31:02 compute-0 podman[223641]: 2025-10-02 08:31:02.487833934 +0000 UTC m=+0.030422530 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:31:02 compute-0 systemd[1]: Started libpod-conmon-3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a.scope.
Oct 02 08:31:02 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:31:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ba91c5aaff165c6a0ab8b58b41edb52d26083ad1fbded79babfac9fc9f62018/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:31:02 compute-0 podman[223641]: 2025-10-02 08:31:02.639418659 +0000 UTC m=+0.182007275 container init 3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001)
Oct 02 08:31:02 compute-0 podman[223641]: 2025-10-02 08:31:02.649939196 +0000 UTC m=+0.192527782 container start 3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 02 08:31:02 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[223656]: [NOTICE]   (223660) : New worker (223662) forked
Oct 02 08:31:02 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[223656]: [NOTICE]   (223660) : Loading success.
Oct 02 08:31:02 compute-0 nova_compute[192567]: 2025-10-02 08:31:02.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:03 compute-0 nova_compute[192567]: 2025-10-02 08:31:03.081 2 DEBUG nova.network.neutron [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Updating instance_info_cache with network_info: [{"id": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "address": "fa:16:3e:0b:87:71", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d466f0d-d5", "ovs_interfaceid": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:31:03 compute-0 nova_compute[192567]: 2025-10-02 08:31:03.109 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-730536ae-5a6b-4165-b40b-412e4afc0180" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:31:03 compute-0 nova_compute[192567]: 2025-10-02 08:31:03.129 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:03 compute-0 nova_compute[192567]: 2025-10-02 08:31:03.130 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:03 compute-0 nova_compute[192567]: 2025-10-02 08:31:03.131 2 DEBUG oslo_concurrency.lockutils [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:03 compute-0 nova_compute[192567]: 2025-10-02 08:31:03.137 2 INFO nova.virt.libvirt.driver [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:31:03 compute-0 virtqemud[192112]: Domain id=16 name='instance-00000015' uuid=730536ae-5a6b-4165-b40b-412e4afc0180 is tainted: custom-monitor
Oct 02 08:31:04 compute-0 podman[223671]: 2025-10-02 08:31:04.144778218 +0000 UTC m=+0.066139272 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 02 08:31:04 compute-0 nova_compute[192567]: 2025-10-02 08:31:04.147 2 INFO nova.virt.libvirt.driver [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:31:04 compute-0 nova_compute[192567]: 2025-10-02 08:31:04.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:05 compute-0 nova_compute[192567]: 2025-10-02 08:31:05.153 2 INFO nova.virt.libvirt.driver [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:31:05 compute-0 nova_compute[192567]: 2025-10-02 08:31:05.159 2 DEBUG nova.compute.manager [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:31:05 compute-0 nova_compute[192567]: 2025-10-02 08:31:05.184 2 DEBUG nova.objects.instance [None req-c3031133-720a-4ba7-8236-66ebb4565b11 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:31:07 compute-0 nova_compute[192567]: 2025-10-02 08:31:07.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:08 compute-0 nova_compute[192567]: 2025-10-02 08:31:08.587 2 DEBUG nova.compute.manager [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpds9wrrjs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f47dec78-bacb-4384-88ef-d1a8d1a68902',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:31:08 compute-0 nova_compute[192567]: 2025-10-02 08:31:08.630 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-f47dec78-bacb-4384-88ef-d1a8d1a68902" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:31:08 compute-0 nova_compute[192567]: 2025-10-02 08:31:08.630 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-f47dec78-bacb-4384-88ef-d1a8d1a68902" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:31:08 compute-0 nova_compute[192567]: 2025-10-02 08:31:08.630 2 DEBUG nova.network.neutron [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:31:09 compute-0 nova_compute[192567]: 2025-10-02 08:31:09.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.492 2 DEBUG nova.network.neutron [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Updating instance_info_cache with network_info: [{"id": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "address": "fa:16:3e:84:5c:db", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap667ca661-ba", "ovs_interfaceid": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.522 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-f47dec78-bacb-4384-88ef-d1a8d1a68902" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.526 2 DEBUG nova.virt.libvirt.driver [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpds9wrrjs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f47dec78-bacb-4384-88ef-d1a8d1a68902',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.527 2 DEBUG nova.virt.libvirt.driver [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Creating instance directory: /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.528 2 DEBUG nova.virt.libvirt.driver [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Creating disk.info with the contents: {'/var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk': 'qcow2', '/var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.529 2 DEBUG nova.virt.libvirt.driver [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.531 2 DEBUG nova.objects.instance [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f47dec78-bacb-4384-88ef-d1a8d1a68902 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.575 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.665 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.668 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.669 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.694 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.774 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.775 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.825 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.827 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.828 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.918 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.920 2 DEBUG nova.virt.disk.api [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.921 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.994 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.996 2 DEBUG nova.virt.disk.api [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:31:11 compute-0 nova_compute[192567]: 2025-10-02 08:31:11.997 2 DEBUG nova.objects.instance [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid f47dec78-bacb-4384-88ef-d1a8d1a68902 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.014 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.046 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk.config 485376" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.049 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk.config to /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.049 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk.config /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.634 2 DEBUG oslo_concurrency.processutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902/disk.config /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902" returned: 0 in 0.585s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.635 2 DEBUG nova.virt.libvirt.driver [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.636 2 DEBUG nova.virt.libvirt.vif [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1093136401',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1093136401',id=22,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:30:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-w2ojwunm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:30:02Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=f47dec78-bacb-4384-88ef-d1a8d1a68902,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "address": "fa:16:3e:84:5c:db", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap667ca661-ba", "ovs_interfaceid": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.637 2 DEBUG nova.network.os_vif_util [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "address": "fa:16:3e:84:5c:db", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap667ca661-ba", "ovs_interfaceid": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.638 2 DEBUG nova.network.os_vif_util [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:5c:db,bridge_name='br-int',has_traffic_filtering=True,id=667ca661-ba0f-4cc5-b5f9-71fc491134bb,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap667ca661-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.638 2 DEBUG os_vif [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:5c:db,bridge_name='br-int',has_traffic_filtering=True,id=667ca661-ba0f-4cc5-b5f9-71fc491134bb,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap667ca661-ba') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.640 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.640 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.643 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap667ca661-ba, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.644 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap667ca661-ba, col_values=(('external_ids', {'iface-id': '667ca661-ba0f-4cc5-b5f9-71fc491134bb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:5c:db', 'vm-uuid': 'f47dec78-bacb-4384-88ef-d1a8d1a68902'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:12 compute-0 NetworkManager[51654]: <info>  [1759393872.6479] manager: (tap667ca661-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.657 2 INFO os_vif [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:5c:db,bridge_name='br-int',has_traffic_filtering=True,id=667ca661-ba0f-4cc5-b5f9-71fc491134bb,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap667ca661-ba')
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.657 2 DEBUG nova.virt.libvirt.driver [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.658 2 DEBUG nova.compute.manager [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpds9wrrjs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f47dec78-bacb-4384-88ef-d1a8d1a68902',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:31:12 compute-0 nova_compute[192567]: 2025-10-02 08:31:12.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:13 compute-0 nova_compute[192567]: 2025-10-02 08:31:13.735 2 DEBUG nova.network.neutron [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Port 667ca661-ba0f-4cc5-b5f9-71fc491134bb updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:31:13 compute-0 nova_compute[192567]: 2025-10-02 08:31:13.738 2 DEBUG nova.compute.manager [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpds9wrrjs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f47dec78-bacb-4384-88ef-d1a8d1a68902',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:31:14 compute-0 kernel: tap667ca661-ba: entered promiscuous mode
Oct 02 08:31:14 compute-0 NetworkManager[51654]: <info>  [1759393874.0576] manager: (tap667ca661-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Oct 02 08:31:14 compute-0 ovn_controller[94821]: 2025-10-02T08:31:14Z|00178|binding|INFO|Claiming lport 667ca661-ba0f-4cc5-b5f9-71fc491134bb for this additional chassis.
Oct 02 08:31:14 compute-0 ovn_controller[94821]: 2025-10-02T08:31:14Z|00179|binding|INFO|667ca661-ba0f-4cc5-b5f9-71fc491134bb: Claiming fa:16:3e:84:5c:db 10.100.0.13
Oct 02 08:31:14 compute-0 nova_compute[192567]: 2025-10-02 08:31:14.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:14 compute-0 ovn_controller[94821]: 2025-10-02T08:31:14Z|00180|binding|INFO|Setting lport 667ca661-ba0f-4cc5-b5f9-71fc491134bb ovn-installed in OVS
Oct 02 08:31:14 compute-0 nova_compute[192567]: 2025-10-02 08:31:14.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:14 compute-0 nova_compute[192567]: 2025-10-02 08:31:14.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:14 compute-0 systemd-udevd[223731]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:31:14 compute-0 systemd-machined[152597]: New machine qemu-17-instance-00000016.
Oct 02 08:31:14 compute-0 NetworkManager[51654]: <info>  [1759393874.1328] device (tap667ca661-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:31:14 compute-0 NetworkManager[51654]: <info>  [1759393874.1337] device (tap667ca661-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:31:14 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000016.
Oct 02 08:31:15 compute-0 nova_compute[192567]: 2025-10-02 08:31:15.406 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393875.4060905, f47dec78-bacb-4384-88ef-d1a8d1a68902 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:31:15 compute-0 nova_compute[192567]: 2025-10-02 08:31:15.407 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] VM Started (Lifecycle Event)
Oct 02 08:31:15 compute-0 nova_compute[192567]: 2025-10-02 08:31:15.439 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:31:16 compute-0 nova_compute[192567]: 2025-10-02 08:31:16.308 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759393876.3083923, f47dec78-bacb-4384-88ef-d1a8d1a68902 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:31:16 compute-0 nova_compute[192567]: 2025-10-02 08:31:16.309 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] VM Resumed (Lifecycle Event)
Oct 02 08:31:16 compute-0 nova_compute[192567]: 2025-10-02 08:31:16.325 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:31:16 compute-0 nova_compute[192567]: 2025-10-02 08:31:16.327 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:31:16 compute-0 nova_compute[192567]: 2025-10-02 08:31:16.346 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:31:17 compute-0 podman[223763]: 2025-10-02 08:31:17.24635777 +0000 UTC m=+0.101606177 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:31:17 compute-0 podman[223764]: 2025-10-02 08:31:17.246571277 +0000 UTC m=+0.096203520 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:31:17 compute-0 podman[223761]: 2025-10-02 08:31:17.274767056 +0000 UTC m=+0.136979360 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 02 08:31:17 compute-0 podman[223762]: 2025-10-02 08:31:17.284666134 +0000 UTC m=+0.144265167 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 02 08:31:17 compute-0 nova_compute[192567]: 2025-10-02 08:31:17.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:17 compute-0 nova_compute[192567]: 2025-10-02 08:31:17.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:18 compute-0 ovn_controller[94821]: 2025-10-02T08:31:18Z|00181|binding|INFO|Claiming lport 667ca661-ba0f-4cc5-b5f9-71fc491134bb for this chassis.
Oct 02 08:31:18 compute-0 ovn_controller[94821]: 2025-10-02T08:31:18Z|00182|binding|INFO|667ca661-ba0f-4cc5-b5f9-71fc491134bb: Claiming fa:16:3e:84:5c:db 10.100.0.13
Oct 02 08:31:18 compute-0 ovn_controller[94821]: 2025-10-02T08:31:18Z|00183|binding|INFO|Setting lport 667ca661-ba0f-4cc5-b5f9-71fc491134bb up in Southbound
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.238 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:5c:db 10.100.0.13'], port_security=['fa:16:3e:84:5c:db 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f47dec78-bacb-4384-88ef-d1a8d1a68902', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=667ca661-ba0f-4cc5-b5f9-71fc491134bb) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.240 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 667ca661-ba0f-4cc5-b5f9-71fc491134bb in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d bound to our chassis
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.243 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.269 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[bc1b0091-ee45-430c-a66c-c7e617ea27a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.314 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[77c70413-c4f2-4432-98e0-df60716bd83c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.318 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[1d15644e-2d82-4d2a-8709-6890f6e9e65e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.358 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[8c9576c2-ab66-4034-a4a7-f01f14fcd1fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.383 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[68fc35df-baf9-4714-836c-729a44651511]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474941, 'reachable_time': 41515, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223847, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.412 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[ad97850c-2b1c-4733-b8fe-bcad955bda69]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474958, 'tstamp': 474958}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223848, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474963, 'tstamp': 474963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223848, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.414 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:18 compute-0 nova_compute[192567]: 2025-10-02 08:31:18.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.420 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.420 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.421 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:18 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:18.421 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:31:18 compute-0 nova_compute[192567]: 2025-10-02 08:31:18.479 2 INFO nova.compute.manager [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Post operation of migration started
Oct 02 08:31:19 compute-0 nova_compute[192567]: 2025-10-02 08:31:19.252 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-f47dec78-bacb-4384-88ef-d1a8d1a68902" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:31:19 compute-0 nova_compute[192567]: 2025-10-02 08:31:19.252 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-f47dec78-bacb-4384-88ef-d1a8d1a68902" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:31:19 compute-0 nova_compute[192567]: 2025-10-02 08:31:19.252 2 DEBUG nova.network.neutron [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:31:21 compute-0 nova_compute[192567]: 2025-10-02 08:31:21.277 2 DEBUG nova.network.neutron [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Updating instance_info_cache with network_info: [{"id": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "address": "fa:16:3e:84:5c:db", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap667ca661-ba", "ovs_interfaceid": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:31:21 compute-0 nova_compute[192567]: 2025-10-02 08:31:21.300 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-f47dec78-bacb-4384-88ef-d1a8d1a68902" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:31:21 compute-0 nova_compute[192567]: 2025-10-02 08:31:21.324 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:21 compute-0 nova_compute[192567]: 2025-10-02 08:31:21.324 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:21 compute-0 nova_compute[192567]: 2025-10-02 08:31:21.325 2 DEBUG oslo_concurrency.lockutils [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:21 compute-0 nova_compute[192567]: 2025-10-02 08:31:21.330 2 INFO nova.virt.libvirt.driver [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:31:21 compute-0 virtqemud[192112]: Domain id=17 name='instance-00000016' uuid=f47dec78-bacb-4384-88ef-d1a8d1a68902 is tainted: custom-monitor
Oct 02 08:31:22 compute-0 nova_compute[192567]: 2025-10-02 08:31:22.338 2 INFO nova.virt.libvirt.driver [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:31:22 compute-0 nova_compute[192567]: 2025-10-02 08:31:22.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:22 compute-0 nova_compute[192567]: 2025-10-02 08:31:22.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:23 compute-0 podman[223850]: 2025-10-02 08:31:23.172382343 +0000 UTC m=+0.078857578 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:31:23 compute-0 nova_compute[192567]: 2025-10-02 08:31:23.345 2 INFO nova.virt.libvirt.driver [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:31:23 compute-0 nova_compute[192567]: 2025-10-02 08:31:23.351 2 DEBUG nova.compute.manager [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:31:23 compute-0 nova_compute[192567]: 2025-10-02 08:31:23.382 2 DEBUG nova.objects.instance [None req-cc9661ab-248a-4f87-bd0d-ab8faa7f89c8 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:31:27 compute-0 nova_compute[192567]: 2025-10-02 08:31:27.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:27 compute-0 nova_compute[192567]: 2025-10-02 08:31:27.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.540 2 DEBUG oslo_concurrency.lockutils [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "f47dec78-bacb-4384-88ef-d1a8d1a68902" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.541 2 DEBUG oslo_concurrency.lockutils [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "f47dec78-bacb-4384-88ef-d1a8d1a68902" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.542 2 DEBUG oslo_concurrency.lockutils [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "f47dec78-bacb-4384-88ef-d1a8d1a68902-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.543 2 DEBUG oslo_concurrency.lockutils [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "f47dec78-bacb-4384-88ef-d1a8d1a68902-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.543 2 DEBUG oslo_concurrency.lockutils [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "f47dec78-bacb-4384-88ef-d1a8d1a68902-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.545 2 INFO nova.compute.manager [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Terminating instance
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.547 2 DEBUG nova.compute.manager [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:31:28 compute-0 kernel: tap667ca661-ba (unregistering): left promiscuous mode
Oct 02 08:31:28 compute-0 NetworkManager[51654]: <info>  [1759393888.5810] device (tap667ca661-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:31:28 compute-0 ovn_controller[94821]: 2025-10-02T08:31:28Z|00184|binding|INFO|Releasing lport 667ca661-ba0f-4cc5-b5f9-71fc491134bb from this chassis (sb_readonly=0)
Oct 02 08:31:28 compute-0 ovn_controller[94821]: 2025-10-02T08:31:28Z|00185|binding|INFO|Setting lport 667ca661-ba0f-4cc5-b5f9-71fc491134bb down in Southbound
Oct 02 08:31:28 compute-0 ovn_controller[94821]: 2025-10-02T08:31:28Z|00186|binding|INFO|Removing iface tap667ca661-ba ovn-installed in OVS
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.603 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:5c:db 10.100.0.13'], port_security=['fa:16:3e:84:5c:db 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f47dec78-bacb-4384-88ef-d1a8d1a68902', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=667ca661-ba0f-4cc5-b5f9-71fc491134bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.606 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 667ca661-ba0f-4cc5-b5f9-71fc491134bb in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.610 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.633 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5fff8ffd-7726-4923-8ffb-fe7264ee0afa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:28 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000016.scope: Deactivated successfully.
Oct 02 08:31:28 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000016.scope: Consumed 2.459s CPU time.
Oct 02 08:31:28 compute-0 systemd-machined[152597]: Machine qemu-17-instance-00000016 terminated.
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.670 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[b396aa03-fc93-4ff3-b4b1-048432004808]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.673 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0ee4f1-7c8e-47d4-8f2a-6862b93e3d42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.707 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8c9dcb-9b0e-4ad8-ba14-1a45c6daac0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.727 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[6a6bfc44-9d59-43cb-be02-eacf9e7eccfc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 43, 'tx_packets': 7, 'rx_bytes': 2302, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 43, 'tx_packets': 7, 'rx_bytes': 2302, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474941, 'reachable_time': 41515, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223886, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.746 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8b3983d6-8b06-4cb9-a6ff-b39467cb1aee]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474958, 'tstamp': 474958}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223887, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474963, 'tstamp': 474963}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223887, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.747 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.812 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.812 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.813 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:28.814 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.857 2 INFO nova.virt.libvirt.driver [-] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Instance destroyed successfully.
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.858 2 DEBUG nova.objects.instance [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'resources' on Instance uuid f47dec78-bacb-4384-88ef-d1a8d1a68902 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.875 2 DEBUG nova.virt.libvirt.vif [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1093136401',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1093136401',id=22,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:30:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-w2ojwunm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0
',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:31:23Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=f47dec78-bacb-4384-88ef-d1a8d1a68902,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "address": "fa:16:3e:84:5c:db", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap667ca661-ba", "ovs_interfaceid": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.875 2 DEBUG nova.network.os_vif_util [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "address": "fa:16:3e:84:5c:db", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap667ca661-ba", "ovs_interfaceid": "667ca661-ba0f-4cc5-b5f9-71fc491134bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.876 2 DEBUG nova.network.os_vif_util [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:84:5c:db,bridge_name='br-int',has_traffic_filtering=True,id=667ca661-ba0f-4cc5-b5f9-71fc491134bb,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap667ca661-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.877 2 DEBUG os_vif [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:5c:db,bridge_name='br-int',has_traffic_filtering=True,id=667ca661-ba0f-4cc5-b5f9-71fc491134bb,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap667ca661-ba') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.880 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap667ca661-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.887 2 INFO os_vif [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:5c:db,bridge_name='br-int',has_traffic_filtering=True,id=667ca661-ba0f-4cc5-b5f9-71fc491134bb,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap667ca661-ba')
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.887 2 INFO nova.virt.libvirt.driver [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Deleting instance files /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902_del
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.888 2 INFO nova.virt.libvirt.driver [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Deletion of /var/lib/nova/instances/f47dec78-bacb-4384-88ef-d1a8d1a68902_del complete
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.950 2 INFO nova.compute.manager [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Took 0.40 seconds to destroy the instance on the hypervisor.
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.951 2 DEBUG oslo.service.loopingcall [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.951 2 DEBUG nova.compute.manager [-] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:31:28 compute-0 nova_compute[192567]: 2025-10-02 08:31:28.951 2 DEBUG nova.network.neutron [-] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:31:29 compute-0 nova_compute[192567]: 2025-10-02 08:31:29.635 2 DEBUG nova.compute.manager [req-6d5e3cc7-14c2-4fe6-9b2b-e00be44cb341 req-092777b0-aa9c-4bde-9d6f-ba04ef3764f9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Received event network-vif-unplugged-667ca661-ba0f-4cc5-b5f9-71fc491134bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:31:29 compute-0 nova_compute[192567]: 2025-10-02 08:31:29.636 2 DEBUG oslo_concurrency.lockutils [req-6d5e3cc7-14c2-4fe6-9b2b-e00be44cb341 req-092777b0-aa9c-4bde-9d6f-ba04ef3764f9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "f47dec78-bacb-4384-88ef-d1a8d1a68902-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:29 compute-0 nova_compute[192567]: 2025-10-02 08:31:29.636 2 DEBUG oslo_concurrency.lockutils [req-6d5e3cc7-14c2-4fe6-9b2b-e00be44cb341 req-092777b0-aa9c-4bde-9d6f-ba04ef3764f9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f47dec78-bacb-4384-88ef-d1a8d1a68902-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:29 compute-0 nova_compute[192567]: 2025-10-02 08:31:29.636 2 DEBUG oslo_concurrency.lockutils [req-6d5e3cc7-14c2-4fe6-9b2b-e00be44cb341 req-092777b0-aa9c-4bde-9d6f-ba04ef3764f9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f47dec78-bacb-4384-88ef-d1a8d1a68902-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:29 compute-0 nova_compute[192567]: 2025-10-02 08:31:29.637 2 DEBUG nova.compute.manager [req-6d5e3cc7-14c2-4fe6-9b2b-e00be44cb341 req-092777b0-aa9c-4bde-9d6f-ba04ef3764f9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] No waiting events found dispatching network-vif-unplugged-667ca661-ba0f-4cc5-b5f9-71fc491134bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:31:29 compute-0 nova_compute[192567]: 2025-10-02 08:31:29.637 2 DEBUG nova.compute.manager [req-6d5e3cc7-14c2-4fe6-9b2b-e00be44cb341 req-092777b0-aa9c-4bde-9d6f-ba04ef3764f9 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Received event network-vif-unplugged-667ca661-ba0f-4cc5-b5f9-71fc491134bb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:31:29 compute-0 podman[203011]: time="2025-10-02T08:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:31:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20779 "" "Go-http-client/1.1"
Oct 02 08:31:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3478 "" "Go-http-client/1.1"
Oct 02 08:31:30 compute-0 nova_compute[192567]: 2025-10-02 08:31:30.280 2 DEBUG nova.network.neutron [-] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:31:30 compute-0 nova_compute[192567]: 2025-10-02 08:31:30.319 2 INFO nova.compute.manager [-] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Took 1.37 seconds to deallocate network for instance.
Oct 02 08:31:30 compute-0 nova_compute[192567]: 2025-10-02 08:31:30.383 2 DEBUG nova.compute.manager [req-af1ce601-6255-44ed-b20c-29ab8572708d req-9240665a-45af-4144-a0b0-14d0e35b480b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Received event network-vif-deleted-667ca661-ba0f-4cc5-b5f9-71fc491134bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:31:30 compute-0 nova_compute[192567]: 2025-10-02 08:31:30.388 2 DEBUG oslo_concurrency.lockutils [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:30 compute-0 nova_compute[192567]: 2025-10-02 08:31:30.388 2 DEBUG oslo_concurrency.lockutils [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:30 compute-0 nova_compute[192567]: 2025-10-02 08:31:30.394 2 DEBUG oslo_concurrency.lockutils [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:30 compute-0 nova_compute[192567]: 2025-10-02 08:31:30.428 2 INFO nova.scheduler.client.report [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Deleted allocations for instance f47dec78-bacb-4384-88ef-d1a8d1a68902
Oct 02 08:31:30 compute-0 nova_compute[192567]: 2025-10-02 08:31:30.510 2 DEBUG oslo_concurrency.lockutils [None req-f3286566-a133-40a4-8777-46bfd1c681c9 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "f47dec78-bacb-4384-88ef-d1a8d1a68902" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.969s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:31 compute-0 openstack_network_exporter[205118]: ERROR   08:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:31:31 compute-0 openstack_network_exporter[205118]: ERROR   08:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:31:31 compute-0 openstack_network_exporter[205118]: ERROR   08:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:31:31 compute-0 openstack_network_exporter[205118]: ERROR   08:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:31:31 compute-0 openstack_network_exporter[205118]: ERROR   08:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.645 2 DEBUG oslo_concurrency.lockutils [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "730536ae-5a6b-4165-b40b-412e4afc0180" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.646 2 DEBUG oslo_concurrency.lockutils [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "730536ae-5a6b-4165-b40b-412e4afc0180" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.647 2 DEBUG oslo_concurrency.lockutils [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "730536ae-5a6b-4165-b40b-412e4afc0180-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.647 2 DEBUG oslo_concurrency.lockutils [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "730536ae-5a6b-4165-b40b-412e4afc0180-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.648 2 DEBUG oslo_concurrency.lockutils [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "730536ae-5a6b-4165-b40b-412e4afc0180-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.650 2 INFO nova.compute.manager [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Terminating instance
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.652 2 DEBUG nova.compute.manager [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:31:31 compute-0 kernel: tap2d466f0d-d5 (unregistering): left promiscuous mode
Oct 02 08:31:31 compute-0 NetworkManager[51654]: <info>  [1759393891.6828] device (tap2d466f0d-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:31 compute-0 ovn_controller[94821]: 2025-10-02T08:31:31Z|00187|binding|INFO|Releasing lport 2d466f0d-d549-40f3-8d0a-a971202ed70d from this chassis (sb_readonly=0)
Oct 02 08:31:31 compute-0 ovn_controller[94821]: 2025-10-02T08:31:31Z|00188|binding|INFO|Setting lport 2d466f0d-d549-40f3-8d0a-a971202ed70d down in Southbound
Oct 02 08:31:31 compute-0 ovn_controller[94821]: 2025-10-02T08:31:31Z|00189|binding|INFO|Removing iface tap2d466f0d-d5 ovn-installed in OVS
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:31 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:31.705 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:87:71 10.100.0.9'], port_security=['fa:16:3e:0b:87:71 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '730536ae-5a6b-4165-b40b-412e4afc0180', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=2d466f0d-d549-40f3-8d0a-a971202ed70d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:31:31 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:31.707 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 2d466f0d-d549-40f3-8d0a-a971202ed70d in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:31:31 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:31.708 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:31:31 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:31.709 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[35dfef3f-1aaa-4f5c-a95d-a234efcf3d18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:31 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:31.710 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace which is not needed anymore
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.732 2 DEBUG nova.compute.manager [req-ce1a0956-527d-4334-985c-1861c91bd7c2 req-84dd6c94-e867-4d68-8a42-6f2558647546 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Received event network-vif-plugged-667ca661-ba0f-4cc5-b5f9-71fc491134bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.733 2 DEBUG oslo_concurrency.lockutils [req-ce1a0956-527d-4334-985c-1861c91bd7c2 req-84dd6c94-e867-4d68-8a42-6f2558647546 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "f47dec78-bacb-4384-88ef-d1a8d1a68902-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.733 2 DEBUG oslo_concurrency.lockutils [req-ce1a0956-527d-4334-985c-1861c91bd7c2 req-84dd6c94-e867-4d68-8a42-6f2558647546 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f47dec78-bacb-4384-88ef-d1a8d1a68902-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.733 2 DEBUG oslo_concurrency.lockutils [req-ce1a0956-527d-4334-985c-1861c91bd7c2 req-84dd6c94-e867-4d68-8a42-6f2558647546 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "f47dec78-bacb-4384-88ef-d1a8d1a68902-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.734 2 DEBUG nova.compute.manager [req-ce1a0956-527d-4334-985c-1861c91bd7c2 req-84dd6c94-e867-4d68-8a42-6f2558647546 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] No waiting events found dispatching network-vif-plugged-667ca661-ba0f-4cc5-b5f9-71fc491134bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.734 2 WARNING nova.compute.manager [req-ce1a0956-527d-4334-985c-1861c91bd7c2 req-84dd6c94-e867-4d68-8a42-6f2558647546 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Received unexpected event network-vif-plugged-667ca661-ba0f-4cc5-b5f9-71fc491134bb for instance with vm_state deleted and task_state None.
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:31 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Deactivated successfully.
Oct 02 08:31:31 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Consumed 3.276s CPU time.
Oct 02 08:31:31 compute-0 systemd-machined[152597]: Machine qemu-16-instance-00000015 terminated.
Oct 02 08:31:31 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[223656]: [NOTICE]   (223660) : haproxy version is 2.8.14-c23fe91
Oct 02 08:31:31 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[223656]: [NOTICE]   (223660) : path to executable is /usr/sbin/haproxy
Oct 02 08:31:31 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[223656]: [WARNING]  (223660) : Exiting Master process...
Oct 02 08:31:31 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[223656]: [ALERT]    (223660) : Current worker (223662) exited with code 143 (Terminated)
Oct 02 08:31:31 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[223656]: [WARNING]  (223660) : All workers exited. Exiting... (0)
Oct 02 08:31:31 compute-0 systemd[1]: libpod-3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a.scope: Deactivated successfully.
Oct 02 08:31:31 compute-0 podman[223932]: 2025-10-02 08:31:31.937929339 +0000 UTC m=+0.070519428 container died 3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.954 2 INFO nova.virt.libvirt.driver [-] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Instance destroyed successfully.
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.955 2 DEBUG nova.objects.instance [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'resources' on Instance uuid 730536ae-5a6b-4165-b40b-412e4afc0180 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.973 2 DEBUG nova.virt.libvirt.vif [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:29:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-269880737',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-269880737',id=21,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:29:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-rp5mkqay',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:31:05Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=730536ae-5a6b-4165-b40b-412e4afc0180,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "address": "fa:16:3e:0b:87:71", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d466f0d-d5", "ovs_interfaceid": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.974 2 DEBUG nova.network.os_vif_util [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "address": "fa:16:3e:0b:87:71", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d466f0d-d5", "ovs_interfaceid": "2d466f0d-d549-40f3-8d0a-a971202ed70d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.975 2 DEBUG nova.network.os_vif_util [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0b:87:71,bridge_name='br-int',has_traffic_filtering=True,id=2d466f0d-d549-40f3-8d0a-a971202ed70d,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d466f0d-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.976 2 DEBUG os_vif [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:87:71,bridge_name='br-int',has_traffic_filtering=True,id=2d466f0d-d549-40f3-8d0a-a971202ed70d,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d466f0d-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.980 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d466f0d-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a-userdata-shm.mount: Deactivated successfully.
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.987 2 INFO os_vif [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:87:71,bridge_name='br-int',has_traffic_filtering=True,id=2d466f0d-d549-40f3-8d0a-a971202ed70d,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d466f0d-d5')
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.988 2 INFO nova.virt.libvirt.driver [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Deleting instance files /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180_del
Oct 02 08:31:31 compute-0 nova_compute[192567]: 2025-10-02 08:31:31.990 2 INFO nova.virt.libvirt.driver [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Deletion of /var/lib/nova/instances/730536ae-5a6b-4165-b40b-412e4afc0180_del complete
Oct 02 08:31:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-9ba91c5aaff165c6a0ab8b58b41edb52d26083ad1fbded79babfac9fc9f62018-merged.mount: Deactivated successfully.
Oct 02 08:31:32 compute-0 podman[223932]: 2025-10-02 08:31:32.004107821 +0000 UTC m=+0.136697900 container cleanup 3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 08:31:32 compute-0 systemd[1]: libpod-conmon-3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a.scope: Deactivated successfully.
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.063 2 INFO nova.compute.manager [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Took 0.41 seconds to destroy the instance on the hypervisor.
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.064 2 DEBUG oslo.service.loopingcall [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.065 2 DEBUG nova.compute.manager [-] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.065 2 DEBUG nova.network.neutron [-] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:31:32 compute-0 podman[223978]: 2025-10-02 08:31:32.104973796 +0000 UTC m=+0.065424960 container remove 3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 02 08:31:32 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:32.112 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[526b5378-c8bf-4c54-b802-de44949ac346]: (4, ('Thu Oct  2 08:31:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a)\n3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a\nThu Oct  2 08:31:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a)\n3a4c12df8d5df3ab6dae6df8cbe8daaa07c1ffbccefc49a58e97ddc2f328af2a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:32 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:32.116 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[38f32bc6-432a-4c68-acfd-8310e090688f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:32 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:32.117 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:32 compute-0 kernel: tap08b16a0c-b0: left promiscuous mode
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:32 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:32.195 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[1c455b39-9789-4c0f-998b-6b24e33f32a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:32 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:32.224 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[79fd2a90-3847-4c59-b87d-555e2323b5f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:32 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:32.226 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba4ca44-604a-4c10-ba60-8011b6b1411f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:32 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:32.247 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e640fe32-089b-4e23-9390-71535bd71137]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474929, 'reachable_time': 41413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223993, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:32 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:32.250 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:31:32 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:32.250 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9f65e9-2560-4a08-b4fa-e2d18c694482]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:31:32 compute-0 systemd[1]: run-netns-ovnmeta\x2d08b16a0c\x2db69f\x2d4a34\x2d9bfe\x2d830099adfe8d.mount: Deactivated successfully.
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.521 2 DEBUG nova.compute.manager [req-0e7c0322-606e-4f89-aff8-2f785ed773ab req-6f2fb0bc-fe05-4125-b7a5-ff840c758113 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Received event network-vif-unplugged-2d466f0d-d549-40f3-8d0a-a971202ed70d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.522 2 DEBUG oslo_concurrency.lockutils [req-0e7c0322-606e-4f89-aff8-2f785ed773ab req-6f2fb0bc-fe05-4125-b7a5-ff840c758113 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "730536ae-5a6b-4165-b40b-412e4afc0180-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.522 2 DEBUG oslo_concurrency.lockutils [req-0e7c0322-606e-4f89-aff8-2f785ed773ab req-6f2fb0bc-fe05-4125-b7a5-ff840c758113 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "730536ae-5a6b-4165-b40b-412e4afc0180-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.522 2 DEBUG oslo_concurrency.lockutils [req-0e7c0322-606e-4f89-aff8-2f785ed773ab req-6f2fb0bc-fe05-4125-b7a5-ff840c758113 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "730536ae-5a6b-4165-b40b-412e4afc0180-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.522 2 DEBUG nova.compute.manager [req-0e7c0322-606e-4f89-aff8-2f785ed773ab req-6f2fb0bc-fe05-4125-b7a5-ff840c758113 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] No waiting events found dispatching network-vif-unplugged-2d466f0d-d549-40f3-8d0a-a971202ed70d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.523 2 DEBUG nova.compute.manager [req-0e7c0322-606e-4f89-aff8-2f785ed773ab req-6f2fb0bc-fe05-4125-b7a5-ff840c758113 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Received event network-vif-unplugged-2d466f0d-d549-40f3-8d0a-a971202ed70d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.716 2 DEBUG nova.network.neutron [-] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.742 2 INFO nova.compute.manager [-] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Took 0.68 seconds to deallocate network for instance.
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.819 2 DEBUG oslo_concurrency.lockutils [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.820 2 DEBUG oslo_concurrency.lockutils [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.829 2 DEBUG oslo_concurrency.lockutils [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.865 2 INFO nova.scheduler.client.report [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Deleted allocations for instance 730536ae-5a6b-4165-b40b-412e4afc0180
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:32 compute-0 nova_compute[192567]: 2025-10-02 08:31:32.947 2 DEBUG oslo_concurrency.lockutils [None req-020cbb52-df8b-49a6-a42c-eab93b774ae1 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "730536ae-5a6b-4165-b40b-412e4afc0180" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:34 compute-0 nova_compute[192567]: 2025-10-02 08:31:34.640 2 DEBUG nova.compute.manager [req-730992d1-9a22-4506-94ef-85ab121c5f00 req-704328d0-01d7-41e9-b59f-f62c6699c745 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Received event network-vif-plugged-2d466f0d-d549-40f3-8d0a-a971202ed70d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:31:34 compute-0 nova_compute[192567]: 2025-10-02 08:31:34.641 2 DEBUG oslo_concurrency.lockutils [req-730992d1-9a22-4506-94ef-85ab121c5f00 req-704328d0-01d7-41e9-b59f-f62c6699c745 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "730536ae-5a6b-4165-b40b-412e4afc0180-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:34 compute-0 nova_compute[192567]: 2025-10-02 08:31:34.641 2 DEBUG oslo_concurrency.lockutils [req-730992d1-9a22-4506-94ef-85ab121c5f00 req-704328d0-01d7-41e9-b59f-f62c6699c745 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "730536ae-5a6b-4165-b40b-412e4afc0180-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:34 compute-0 nova_compute[192567]: 2025-10-02 08:31:34.642 2 DEBUG oslo_concurrency.lockutils [req-730992d1-9a22-4506-94ef-85ab121c5f00 req-704328d0-01d7-41e9-b59f-f62c6699c745 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "730536ae-5a6b-4165-b40b-412e4afc0180-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:34 compute-0 nova_compute[192567]: 2025-10-02 08:31:34.642 2 DEBUG nova.compute.manager [req-730992d1-9a22-4506-94ef-85ab121c5f00 req-704328d0-01d7-41e9-b59f-f62c6699c745 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] No waiting events found dispatching network-vif-plugged-2d466f0d-d549-40f3-8d0a-a971202ed70d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:31:34 compute-0 nova_compute[192567]: 2025-10-02 08:31:34.642 2 WARNING nova.compute.manager [req-730992d1-9a22-4506-94ef-85ab121c5f00 req-704328d0-01d7-41e9-b59f-f62c6699c745 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Received unexpected event network-vif-plugged-2d466f0d-d549-40f3-8d0a-a971202ed70d for instance with vm_state deleted and task_state None.
Oct 02 08:31:34 compute-0 nova_compute[192567]: 2025-10-02 08:31:34.642 2 DEBUG nova.compute.manager [req-730992d1-9a22-4506-94ef-85ab121c5f00 req-704328d0-01d7-41e9-b59f-f62c6699c745 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Received event network-vif-deleted-2d466f0d-d549-40f3-8d0a-a971202ed70d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:31:35 compute-0 podman[223994]: 2025-10-02 08:31:35.196503463 +0000 UTC m=+0.099512403 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Oct 02 08:31:35 compute-0 nova_compute[192567]: 2025-10-02 08:31:35.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:35 compute-0 nova_compute[192567]: 2025-10-02 08:31:35.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:31:35 compute-0 nova_compute[192567]: 2025-10-02 08:31:35.644 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:31:37 compute-0 nova_compute[192567]: 2025-10-02 08:31:37.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:37 compute-0 nova_compute[192567]: 2025-10-02 08:31:37.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:39 compute-0 nova_compute[192567]: 2025-10-02 08:31:39.640 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:42 compute-0 nova_compute[192567]: 2025-10-02 08:31:42.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:42 compute-0 nova_compute[192567]: 2025-10-02 08:31:42.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:42 compute-0 nova_compute[192567]: 2025-10-02 08:31:42.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:31:42 compute-0 nova_compute[192567]: 2025-10-02 08:31:42.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:43 compute-0 nova_compute[192567]: 2025-10-02 08:31:43.650 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:43 compute-0 nova_compute[192567]: 2025-10-02 08:31:43.651 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:31:43 compute-0 nova_compute[192567]: 2025-10-02 08:31:43.651 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:31:43 compute-0 nova_compute[192567]: 2025-10-02 08:31:43.673 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:31:43 compute-0 nova_compute[192567]: 2025-10-02 08:31:43.674 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:43 compute-0 nova_compute[192567]: 2025-10-02 08:31:43.674 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:43 compute-0 nova_compute[192567]: 2025-10-02 08:31:43.855 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393888.8543546, f47dec78-bacb-4384-88ef-d1a8d1a68902 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:31:43 compute-0 nova_compute[192567]: 2025-10-02 08:31:43.856 2 INFO nova.compute.manager [-] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] VM Stopped (Lifecycle Event)
Oct 02 08:31:43 compute-0 nova_compute[192567]: 2025-10-02 08:31:43.877 2 DEBUG nova.compute.manager [None req-38f2dced-9630-4b03-a222-fcac4ede60e0 - - - - - -] [instance: f47dec78-bacb-4384-88ef-d1a8d1a68902] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:31:45 compute-0 nova_compute[192567]: 2025-10-02 08:31:45.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:45 compute-0 nova_compute[192567]: 2025-10-02 08:31:45.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:45 compute-0 nova_compute[192567]: 2025-10-02 08:31:45.756 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:45 compute-0 nova_compute[192567]: 2025-10-02 08:31:45.756 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:45 compute-0 nova_compute[192567]: 2025-10-02 08:31:45.757 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:45 compute-0 nova_compute[192567]: 2025-10-02 08:31:45.757 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:31:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:45.991 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:45.992 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:31:45.992 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.016 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.019 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5856MB free_disk=73.46548080444336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.019 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.020 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.100 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.101 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.126 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.142 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.145 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.146 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.949 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759393891.9477258, 730536ae-5a6b-4165-b40b-412e4afc0180 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.950 2 INFO nova.compute.manager [-] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] VM Stopped (Lifecycle Event)
Oct 02 08:31:46 compute-0 nova_compute[192567]: 2025-10-02 08:31:46.980 2 DEBUG nova.compute.manager [None req-c96d7d2c-00c0-44f6-861c-a2b0d065473a - - - - - -] [instance: 730536ae-5a6b-4165-b40b-412e4afc0180] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:31:47 compute-0 nova_compute[192567]: 2025-10-02 08:31:47.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:47 compute-0 nova_compute[192567]: 2025-10-02 08:31:47.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:48 compute-0 nova_compute[192567]: 2025-10-02 08:31:48.145 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:48 compute-0 nova_compute[192567]: 2025-10-02 08:31:48.146 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:31:48 compute-0 podman[224017]: 2025-10-02 08:31:48.192861545 +0000 UTC m=+0.098407598 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:31:48 compute-0 podman[224019]: 2025-10-02 08:31:48.197354495 +0000 UTC m=+0.107603405 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 02 08:31:48 compute-0 podman[224020]: 2025-10-02 08:31:48.222351825 +0000 UTC m=+0.118156985 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:31:48 compute-0 podman[224018]: 2025-10-02 08:31:48.222378315 +0000 UTC m=+0.123536472 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Oct 02 08:31:48 compute-0 nova_compute[192567]: 2025-10-02 08:31:48.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:49 compute-0 nova_compute[192567]: 2025-10-02 08:31:49.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:52 compute-0 nova_compute[192567]: 2025-10-02 08:31:52.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:52 compute-0 nova_compute[192567]: 2025-10-02 08:31:52.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:54 compute-0 podman[224099]: 2025-10-02 08:31:54.178833047 +0000 UTC m=+0.090814233 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:31:55 compute-0 nova_compute[192567]: 2025-10-02 08:31:55.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:31:57 compute-0 nova_compute[192567]: 2025-10-02 08:31:57.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:57 compute-0 nova_compute[192567]: 2025-10-02 08:31:57.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:31:59 compute-0 podman[203011]: time="2025-10-02T08:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:31:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:31:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Oct 02 08:31:59 compute-0 nova_compute[192567]: 2025-10-02 08:31:59.943 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:01 compute-0 openstack_network_exporter[205118]: ERROR   08:32:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:32:01 compute-0 openstack_network_exporter[205118]: ERROR   08:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:32:01 compute-0 openstack_network_exporter[205118]: ERROR   08:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:32:01 compute-0 openstack_network_exporter[205118]: ERROR   08:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:32:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:32:01 compute-0 openstack_network_exporter[205118]: ERROR   08:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:32:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:32:02 compute-0 nova_compute[192567]: 2025-10-02 08:32:02.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:02 compute-0 nova_compute[192567]: 2025-10-02 08:32:02.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:02 compute-0 ovn_controller[94821]: 2025-10-02T08:32:02Z|00190|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct 02 08:32:02 compute-0 nova_compute[192567]: 2025-10-02 08:32:02.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:06 compute-0 podman[224123]: 2025-10-02 08:32:06.187558908 +0000 UTC m=+0.089636545 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:32:07 compute-0 nova_compute[192567]: 2025-10-02 08:32:07.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:07 compute-0 nova_compute[192567]: 2025-10-02 08:32:07.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:12 compute-0 nova_compute[192567]: 2025-10-02 08:32:12.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:12 compute-0 nova_compute[192567]: 2025-10-02 08:32:12.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:15 compute-0 nova_compute[192567]: 2025-10-02 08:32:15.940 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:17 compute-0 nova_compute[192567]: 2025-10-02 08:32:17.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:17 compute-0 nova_compute[192567]: 2025-10-02 08:32:17.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:19 compute-0 podman[224145]: 2025-10-02 08:32:19.180138521 +0000 UTC m=+0.083830914 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 02 08:32:19 compute-0 podman[224147]: 2025-10-02 08:32:19.202098786 +0000 UTC m=+0.094167237 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 02 08:32:19 compute-0 podman[224148]: 2025-10-02 08:32:19.226548448 +0000 UTC m=+0.119152815 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 02 08:32:19 compute-0 podman[224146]: 2025-10-02 08:32:19.278815998 +0000 UTC m=+0.176300457 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:32:22 compute-0 nova_compute[192567]: 2025-10-02 08:32:22.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:22 compute-0 nova_compute[192567]: 2025-10-02 08:32:22.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:25 compute-0 podman[224231]: 2025-10-02 08:32:25.16154231 +0000 UTC m=+0.076236086 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:32:27 compute-0 nova_compute[192567]: 2025-10-02 08:32:27.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:27 compute-0 nova_compute[192567]: 2025-10-02 08:32:27.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:32:29.266 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:32:29 compute-0 nova_compute[192567]: 2025-10-02 08:32:29.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:32:29.268 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:32:29 compute-0 podman[203011]: time="2025-10-02T08:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:32:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:32:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 02 08:32:31 compute-0 openstack_network_exporter[205118]: ERROR   08:32:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:32:31 compute-0 openstack_network_exporter[205118]: ERROR   08:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:32:31 compute-0 openstack_network_exporter[205118]: ERROR   08:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:32:31 compute-0 openstack_network_exporter[205118]: ERROR   08:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:32:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:32:31 compute-0 openstack_network_exporter[205118]: ERROR   08:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:32:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:32:32 compute-0 nova_compute[192567]: 2025-10-02 08:32:32.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:32 compute-0 nova_compute[192567]: 2025-10-02 08:32:32.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:37 compute-0 nova_compute[192567]: 2025-10-02 08:32:37.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:37 compute-0 podman[224255]: 2025-10-02 08:32:37.180394426 +0000 UTC m=+0.086708593 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 02 08:32:37 compute-0 nova_compute[192567]: 2025-10-02 08:32:37.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:39 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:32:39.272 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:32:40 compute-0 nova_compute[192567]: 2025-10-02 08:32:40.638 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:42 compute-0 nova_compute[192567]: 2025-10-02 08:32:42.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:42 compute-0 nova_compute[192567]: 2025-10-02 08:32:42.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:45 compute-0 nova_compute[192567]: 2025-10-02 08:32:45.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:45 compute-0 nova_compute[192567]: 2025-10-02 08:32:45.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:32:45 compute-0 nova_compute[192567]: 2025-10-02 08:32:45.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:32:45 compute-0 nova_compute[192567]: 2025-10-02 08:32:45.648 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:32:45 compute-0 nova_compute[192567]: 2025-10-02 08:32:45.649 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:45 compute-0 nova_compute[192567]: 2025-10-02 08:32:45.650 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:45 compute-0 nova_compute[192567]: 2025-10-02 08:32:45.650 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:32:45.992 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:32:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:32:45.993 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:32:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:32:45.993 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:32:46 compute-0 nova_compute[192567]: 2025-10-02 08:32:46.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:46 compute-0 nova_compute[192567]: 2025-10-02 08:32:46.666 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:32:46 compute-0 nova_compute[192567]: 2025-10-02 08:32:46.667 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:32:46 compute-0 nova_compute[192567]: 2025-10-02 08:32:46.667 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:32:46 compute-0 nova_compute[192567]: 2025-10-02 08:32:46.668 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:32:46 compute-0 nova_compute[192567]: 2025-10-02 08:32:46.926 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:32:46 compute-0 nova_compute[192567]: 2025-10-02 08:32:46.927 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5891MB free_disk=73.46451568603516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:32:46 compute-0 nova_compute[192567]: 2025-10-02 08:32:46.927 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:32:46 compute-0 nova_compute[192567]: 2025-10-02 08:32:46.928 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:32:47 compute-0 nova_compute[192567]: 2025-10-02 08:32:47.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:47 compute-0 nova_compute[192567]: 2025-10-02 08:32:47.090 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:32:47 compute-0 nova_compute[192567]: 2025-10-02 08:32:47.090 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:32:47 compute-0 nova_compute[192567]: 2025-10-02 08:32:47.178 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:32:47 compute-0 nova_compute[192567]: 2025-10-02 08:32:47.195 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:32:47 compute-0 nova_compute[192567]: 2025-10-02 08:32:47.198 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:32:47 compute-0 nova_compute[192567]: 2025-10-02 08:32:47.199 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:32:48 compute-0 nova_compute[192567]: 2025-10-02 08:32:48.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:49 compute-0 nova_compute[192567]: 2025-10-02 08:32:49.199 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:49 compute-0 nova_compute[192567]: 2025-10-02 08:32:49.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:49 compute-0 nova_compute[192567]: 2025-10-02 08:32:49.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:32:50 compute-0 podman[224278]: 2025-10-02 08:32:50.192506149 +0000 UTC m=+0.086886499 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 02 08:32:50 compute-0 podman[224280]: 2025-10-02 08:32:50.218884851 +0000 UTC m=+0.102128874 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:32:50 compute-0 podman[224281]: 2025-10-02 08:32:50.225077704 +0000 UTC m=+0.096686965 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 02 08:32:50 compute-0 podman[224279]: 2025-10-02 08:32:50.272482731 +0000 UTC m=+0.159443390 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:32:52 compute-0 nova_compute[192567]: 2025-10-02 08:32:52.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:53 compute-0 nova_compute[192567]: 2025-10-02 08:32:53.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:56 compute-0 podman[224356]: 2025-10-02 08:32:56.182803135 +0000 UTC m=+0.088403676 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:32:56 compute-0 nova_compute[192567]: 2025-10-02 08:32:56.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:32:57 compute-0 nova_compute[192567]: 2025-10-02 08:32:57.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:58 compute-0 nova_compute[192567]: 2025-10-02 08:32:58.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:32:59 compute-0 podman[203011]: time="2025-10-02T08:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:32:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:32:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 02 08:33:01 compute-0 openstack_network_exporter[205118]: ERROR   08:33:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:33:01 compute-0 openstack_network_exporter[205118]: ERROR   08:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:33:01 compute-0 openstack_network_exporter[205118]: ERROR   08:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:33:01 compute-0 openstack_network_exporter[205118]: ERROR   08:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:33:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:33:01 compute-0 openstack_network_exporter[205118]: ERROR   08:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:33:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:33:02 compute-0 nova_compute[192567]: 2025-10-02 08:33:02.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:03 compute-0 nova_compute[192567]: 2025-10-02 08:33:03.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:03 compute-0 unix_chkpwd[224382]: password check failed for user (root)
Oct 02 08:33:03 compute-0 sshd-session[224380]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:33:05 compute-0 sshd-session[224380]: Failed password for root from 193.46.255.217 port 37548 ssh2
Oct 02 08:33:07 compute-0 nova_compute[192567]: 2025-10-02 08:33:07.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:07 compute-0 unix_chkpwd[224385]: password check failed for user (operator)
Oct 02 08:33:07 compute-0 sshd-session[224383]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=172.90.128.97  user=operator
Oct 02 08:33:07 compute-0 unix_chkpwd[224386]: password check failed for user (root)
Oct 02 08:33:08 compute-0 nova_compute[192567]: 2025-10-02 08:33:08.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:08 compute-0 podman[224387]: 2025-10-02 08:33:08.244986052 +0000 UTC m=+0.101845645 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7)
Oct 02 08:33:09 compute-0 sshd-session[224383]: Failed password for operator from 172.90.128.97 port 65520 ssh2
Oct 02 08:33:09 compute-0 sshd-session[224380]: Failed password for root from 193.46.255.217 port 37548 ssh2
Oct 02 08:33:09 compute-0 unix_chkpwd[224408]: password check failed for user (root)
Oct 02 08:33:09 compute-0 sshd-session[224383]: Connection closed by authenticating user operator 172.90.128.97 port 65520 [preauth]
Oct 02 08:33:12 compute-0 nova_compute[192567]: 2025-10-02 08:33:12.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:12 compute-0 sshd-session[224380]: Failed password for root from 193.46.255.217 port 37548 ssh2
Oct 02 08:33:13 compute-0 nova_compute[192567]: 2025-10-02 08:33:13.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:13 compute-0 sshd-session[224380]: Received disconnect from 193.46.255.217 port 37548:11:  [preauth]
Oct 02 08:33:13 compute-0 sshd-session[224380]: Disconnected from authenticating user root 193.46.255.217 port 37548 [preauth]
Oct 02 08:33:13 compute-0 sshd-session[224380]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:33:14 compute-0 unix_chkpwd[224411]: password check failed for user (root)
Oct 02 08:33:14 compute-0 sshd-session[224409]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:33:16 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 08:33:16 compute-0 sshd-session[224409]: Failed password for root from 193.46.255.217 port 42032 ssh2
Oct 02 08:33:17 compute-0 nova_compute[192567]: 2025-10-02 08:33:17.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:18 compute-0 nova_compute[192567]: 2025-10-02 08:33:18.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:18 compute-0 unix_chkpwd[224413]: password check failed for user (root)
Oct 02 08:33:19 compute-0 ovn_controller[94821]: 2025-10-02T08:33:19Z|00191|memory_trim|INFO|Detected inactivity (last active 30019 ms ago): trimming memory
Oct 02 08:33:20 compute-0 sshd-session[224409]: Failed password for root from 193.46.255.217 port 42032 ssh2
Oct 02 08:33:20 compute-0 unix_chkpwd[224415]: password check failed for user (root)
Oct 02 08:33:21 compute-0 podman[224418]: 2025-10-02 08:33:21.191772759 +0000 UTC m=+0.085005900 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:33:21 compute-0 podman[224416]: 2025-10-02 08:33:21.209491102 +0000 UTC m=+0.109417921 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 02 08:33:21 compute-0 podman[224419]: 2025-10-02 08:33:21.221187486 +0000 UTC m=+0.102124374 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid)
Oct 02 08:33:21 compute-0 podman[224417]: 2025-10-02 08:33:21.238481255 +0000 UTC m=+0.134857574 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller)
Oct 02 08:33:22 compute-0 nova_compute[192567]: 2025-10-02 08:33:22.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:23 compute-0 sshd-session[224409]: Failed password for root from 193.46.255.217 port 42032 ssh2
Oct 02 08:33:23 compute-0 nova_compute[192567]: 2025-10-02 08:33:23.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:25 compute-0 sshd-session[224409]: Received disconnect from 193.46.255.217 port 42032:11:  [preauth]
Oct 02 08:33:25 compute-0 sshd-session[224409]: Disconnected from authenticating user root 193.46.255.217 port 42032 [preauth]
Oct 02 08:33:25 compute-0 sshd-session[224409]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:33:26 compute-0 unix_chkpwd[224498]: password check failed for user (root)
Oct 02 08:33:26 compute-0 sshd-session[224496]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:33:27 compute-0 nova_compute[192567]: 2025-10-02 08:33:27.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:27 compute-0 podman[224499]: 2025-10-02 08:33:27.178481254 +0000 UTC m=+0.078713123 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:33:28 compute-0 sshd-session[224496]: Failed password for root from 193.46.255.217 port 55212 ssh2
Oct 02 08:33:28 compute-0 nova_compute[192567]: 2025-10-02 08:33:28.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:29 compute-0 podman[203011]: time="2025-10-02T08:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:33:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:33:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Oct 02 08:33:30 compute-0 unix_chkpwd[224525]: password check failed for user (root)
Oct 02 08:33:31 compute-0 openstack_network_exporter[205118]: ERROR   08:33:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:33:31 compute-0 openstack_network_exporter[205118]: ERROR   08:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:33:31 compute-0 openstack_network_exporter[205118]: ERROR   08:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:33:31 compute-0 openstack_network_exporter[205118]: ERROR   08:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:33:31 compute-0 openstack_network_exporter[205118]: ERROR   08:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:33:32 compute-0 nova_compute[192567]: 2025-10-02 08:33:32.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:32 compute-0 sshd-session[224496]: Failed password for root from 193.46.255.217 port 55212 ssh2
Oct 02 08:33:33 compute-0 nova_compute[192567]: 2025-10-02 08:33:33.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:34 compute-0 unix_chkpwd[224526]: password check failed for user (root)
Oct 02 08:33:36 compute-0 sshd-session[224496]: Failed password for root from 193.46.255.217 port 55212 ssh2
Oct 02 08:33:37 compute-0 nova_compute[192567]: 2025-10-02 08:33:37.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:38 compute-0 nova_compute[192567]: 2025-10-02 08:33:38.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:38 compute-0 sshd-session[224496]: Received disconnect from 193.46.255.217 port 55212:11:  [preauth]
Oct 02 08:33:38 compute-0 sshd-session[224496]: Disconnected from authenticating user root 193.46.255.217 port 55212 [preauth]
Oct 02 08:33:38 compute-0 sshd-session[224496]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.217  user=root
Oct 02 08:33:39 compute-0 podman[224527]: 2025-10-02 08:33:39.181645481 +0000 UTC m=+0.090169242 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=, container_name=openstack_network_exporter)
Oct 02 08:33:42 compute-0 nova_compute[192567]: 2025-10-02 08:33:42.070 2 DEBUG nova.virt.libvirt.driver [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Creating tmpfile /var/lib/nova/instances/tmpo8gqv2ig to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:33:42 compute-0 nova_compute[192567]: 2025-10-02 08:33:42.071 2 DEBUG nova.compute.manager [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpo8gqv2ig',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:33:42 compute-0 nova_compute[192567]: 2025-10-02 08:33:42.103 2 DEBUG nova.virt.libvirt.driver [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Creating tmpfile /var/lib/nova/instances/tmppftgbvmq to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:33:42 compute-0 nova_compute[192567]: 2025-10-02 08:33:42.104 2 DEBUG nova.compute.manager [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppftgbvmq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:33:42 compute-0 nova_compute[192567]: 2025-10-02 08:33:42.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:42 compute-0 nova_compute[192567]: 2025-10-02 08:33:42.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:33:43 compute-0 nova_compute[192567]: 2025-10-02 08:33:43.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:44 compute-0 nova_compute[192567]: 2025-10-02 08:33:44.788 2 DEBUG nova.compute.manager [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpo8gqv2ig',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='33ca8439-ed92-4aa7-a1ac-1f387ded8f6a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:33:44 compute-0 nova_compute[192567]: 2025-10-02 08:33:44.808 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-33ca8439-ed92-4aa7-a1ac-1f387ded8f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:33:44 compute-0 nova_compute[192567]: 2025-10-02 08:33:44.808 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-33ca8439-ed92-4aa7-a1ac-1f387ded8f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:33:44 compute-0 nova_compute[192567]: 2025-10-02 08:33:44.809 2 DEBUG nova.network.neutron [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:33:45 compute-0 nova_compute[192567]: 2025-10-02 08:33:45.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:33:45 compute-0 nova_compute[192567]: 2025-10-02 08:33:45.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:33:45 compute-0 nova_compute[192567]: 2025-10-02 08:33:45.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:33:45 compute-0 nova_compute[192567]: 2025-10-02 08:33:45.650 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:33:45 compute-0 nova_compute[192567]: 2025-10-02 08:33:45.651 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:33:45 compute-0 nova_compute[192567]: 2025-10-02 08:33:45.652 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:33:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:45.992 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:33:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:45.993 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:33:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:45.993 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.657 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.658 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.659 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.659 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.705 2 DEBUG nova.network.neutron [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Updating instance_info_cache with network_info: [{"id": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "address": "fa:16:3e:21:41:32", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c491f18-8c", "ovs_interfaceid": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.737 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-33ca8439-ed92-4aa7-a1ac-1f387ded8f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.740 2 DEBUG nova.virt.libvirt.driver [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpo8gqv2ig',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='33ca8439-ed92-4aa7-a1ac-1f387ded8f6a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.741 2 DEBUG nova.virt.libvirt.driver [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Creating instance directory: /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.741 2 DEBUG nova.virt.libvirt.driver [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Creating disk.info with the contents: {'/var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk': 'qcow2', '/var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.742 2 DEBUG nova.virt.libvirt.driver [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.743 2 DEBUG nova.objects.instance [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.791 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.885 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.887 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.888 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:33:46 compute-0 nova_compute[192567]: 2025-10-02 08:33:46.916 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.037 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.038 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.079 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.082 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5886MB free_disk=73.4645004272461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.083 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.083 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.092 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.094 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.095 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.155 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Migration for instance 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.156 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Migration for instance e6c13926-8e85-4724-a812-5029db261405 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.178 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.179 2 DEBUG nova.virt.disk.api [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.180 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.237 2 INFO nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Updating resource usage from migration bbbc8da8-dec1-4269-b013-185f1cb226ae
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.238 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Starting to track incoming migration bbbc8da8-dec1-4269-b013-185f1cb226ae with flavor 932d352e-81e8-4137-94d3-19616d5c2ae2 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.242 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.243 2 DEBUG nova.virt.disk.api [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.243 2 DEBUG nova.objects.instance [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.253 2 INFO nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: e6c13926-8e85-4724-a812-5029db261405] Updating resource usage from migration d11a637f-f5ce-4037-a03b-c159283fd13a
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.254 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: e6c13926-8e85-4724-a812-5029db261405] Starting to track incoming migration d11a637f-f5ce-4037-a03b-c159283fd13a with flavor 932d352e-81e8-4137-94d3-19616d5c2ae2 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.257 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.291 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk.config 485376" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.293 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk.config to /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.294 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk.config /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.402 2 WARNING nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.422 2 WARNING nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance e6c13926-8e85-4724-a812-5029db261405 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.422 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.423 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.493 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.510 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.527 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.527 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.881 2 DEBUG oslo_concurrency.processutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a/disk.config /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a" returned: 0 in 0.588s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.883 2 DEBUG nova.virt.libvirt.driver [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.884 2 DEBUG nova.virt.libvirt.vif [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:32:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-867953464',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-867953464',id=24,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:32:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-xw0wlb0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:32:55Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=33ca8439-ed92-4aa7-a1ac-1f387ded8f6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "address": "fa:16:3e:21:41:32", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1c491f18-8c", "ovs_interfaceid": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.884 2 DEBUG nova.network.os_vif_util [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "address": "fa:16:3e:21:41:32", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1c491f18-8c", "ovs_interfaceid": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.885 2 DEBUG nova.network.os_vif_util [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:41:32,bridge_name='br-int',has_traffic_filtering=True,id=1c491f18-8c69-4b88-97a1-a2d47fcdb0ae,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c491f18-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.886 2 DEBUG os_vif [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:41:32,bridge_name='br-int',has_traffic_filtering=True,id=1c491f18-8c69-4b88-97a1-a2d47fcdb0ae,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c491f18-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.888 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.888 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.892 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c491f18-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.893 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1c491f18-8c, col_values=(('external_ids', {'iface-id': '1c491f18-8c69-4b88-97a1-a2d47fcdb0ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:41:32', 'vm-uuid': '33ca8439-ed92-4aa7-a1ac-1f387ded8f6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:47 compute-0 NetworkManager[51654]: <info>  [1759394027.8971] manager: (tap1c491f18-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.908 2 INFO os_vif [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:41:32,bridge_name='br-int',has_traffic_filtering=True,id=1c491f18-8c69-4b88-97a1-a2d47fcdb0ae,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c491f18-8c')
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.908 2 DEBUG nova.virt.libvirt.driver [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:33:47 compute-0 nova_compute[192567]: 2025-10-02 08:33:47.909 2 DEBUG nova.compute.manager [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpo8gqv2ig',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='33ca8439-ed92-4aa7-a1ac-1f387ded8f6a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:33:48 compute-0 nova_compute[192567]: 2025-10-02 08:33:48.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:48 compute-0 nova_compute[192567]: 2025-10-02 08:33:48.528 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:33:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:48.578 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:33:48 compute-0 nova_compute[192567]: 2025-10-02 08:33:48.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:48 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:48.580 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:33:48 compute-0 nova_compute[192567]: 2025-10-02 08:33:48.980 2 DEBUG nova.network.neutron [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Port 1c491f18-8c69-4b88-97a1-a2d47fcdb0ae updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:33:48 compute-0 nova_compute[192567]: 2025-10-02 08:33:48.983 2 DEBUG nova.compute.manager [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpo8gqv2ig',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='33ca8439-ed92-4aa7-a1ac-1f387ded8f6a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:33:49 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 08:33:49 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 08:33:49 compute-0 kernel: tap1c491f18-8c: entered promiscuous mode
Oct 02 08:33:49 compute-0 NetworkManager[51654]: <info>  [1759394029.3770] manager: (tap1c491f18-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Oct 02 08:33:49 compute-0 nova_compute[192567]: 2025-10-02 08:33:49.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:49 compute-0 ovn_controller[94821]: 2025-10-02T08:33:49Z|00192|binding|INFO|Claiming lport 1c491f18-8c69-4b88-97a1-a2d47fcdb0ae for this additional chassis.
Oct 02 08:33:49 compute-0 ovn_controller[94821]: 2025-10-02T08:33:49Z|00193|binding|INFO|1c491f18-8c69-4b88-97a1-a2d47fcdb0ae: Claiming fa:16:3e:21:41:32 10.100.0.8
Oct 02 08:33:49 compute-0 ovn_controller[94821]: 2025-10-02T08:33:49Z|00194|binding|INFO|Setting lport 1c491f18-8c69-4b88-97a1-a2d47fcdb0ae ovn-installed in OVS
Oct 02 08:33:49 compute-0 nova_compute[192567]: 2025-10-02 08:33:49.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:49 compute-0 nova_compute[192567]: 2025-10-02 08:33:49.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:49 compute-0 systemd-udevd[224603]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:33:49 compute-0 systemd-machined[152597]: New machine qemu-18-instance-00000018.
Oct 02 08:33:49 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000018.
Oct 02 08:33:49 compute-0 NetworkManager[51654]: <info>  [1759394029.4666] device (tap1c491f18-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:33:49 compute-0 NetworkManager[51654]: <info>  [1759394029.4692] device (tap1c491f18-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:33:49 compute-0 nova_compute[192567]: 2025-10-02 08:33:49.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:33:50 compute-0 nova_compute[192567]: 2025-10-02 08:33:50.567 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394030.5664306, 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:33:50 compute-0 nova_compute[192567]: 2025-10-02 08:33:50.567 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] VM Started (Lifecycle Event)
Oct 02 08:33:50 compute-0 nova_compute[192567]: 2025-10-02 08:33:50.626 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:33:51 compute-0 nova_compute[192567]: 2025-10-02 08:33:51.396 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394031.395508, 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:33:51 compute-0 nova_compute[192567]: 2025-10-02 08:33:51.396 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] VM Resumed (Lifecycle Event)
Oct 02 08:33:51 compute-0 nova_compute[192567]: 2025-10-02 08:33:51.417 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:33:51 compute-0 nova_compute[192567]: 2025-10-02 08:33:51.422 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:33:51 compute-0 nova_compute[192567]: 2025-10-02 08:33:51.443 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:33:51 compute-0 nova_compute[192567]: 2025-10-02 08:33:51.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:33:51 compute-0 nova_compute[192567]: 2025-10-02 08:33:51.623 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:33:52 compute-0 podman[224636]: 2025-10-02 08:33:52.205340226 +0000 UTC m=+0.090008966 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 02 08:33:52 compute-0 podman[224634]: 2025-10-02 08:33:52.218217668 +0000 UTC m=+0.109315899 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct 02 08:33:52 compute-0 podman[224637]: 2025-10-02 08:33:52.218409124 +0000 UTC m=+0.098553363 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:33:52 compute-0 podman[224635]: 2025-10-02 08:33:52.23400384 +0000 UTC m=+0.123382087 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 02 08:33:52 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:52.583 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:33:52 compute-0 nova_compute[192567]: 2025-10-02 08:33:52.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:33:52 compute-0 nova_compute[192567]: 2025-10-02 08:33:52.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:53 compute-0 nova_compute[192567]: 2025-10-02 08:33:53.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:55 compute-0 ovn_controller[94821]: 2025-10-02T08:33:55Z|00195|binding|INFO|Claiming lport 1c491f18-8c69-4b88-97a1-a2d47fcdb0ae for this chassis.
Oct 02 08:33:55 compute-0 ovn_controller[94821]: 2025-10-02T08:33:55Z|00196|binding|INFO|1c491f18-8c69-4b88-97a1-a2d47fcdb0ae: Claiming fa:16:3e:21:41:32 10.100.0.8
Oct 02 08:33:55 compute-0 ovn_controller[94821]: 2025-10-02T08:33:55Z|00197|binding|INFO|Setting lport 1c491f18-8c69-4b88-97a1-a2d47fcdb0ae up in Southbound
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.722 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:41:32 10.100.0.8'], port_security=['fa:16:3e:21:41:32 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '33ca8439-ed92-4aa7-a1ac-1f387ded8f6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=1c491f18-8c69-4b88-97a1-a2d47fcdb0ae) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.723 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 1c491f18-8c69-4b88-97a1-a2d47fcdb0ae in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d bound to our chassis
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.724 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.741 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5674749f-1118-4de4-870d-36b16d26f396]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.742 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08b16a0c-b1 in ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.745 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08b16a0c-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.745 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[9f65dbff-d93c-4498-8d95-67e75b8d1b10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.746 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[552dfdbd-5181-4e1d-9844-33b092e849a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.756 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[16b1761d-c4e6-4bcf-a98e-88466be6be65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.786 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[db05db50-46e1-4cd5-ae44-a5c271b3d57b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.828 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[04e0abda-4137-46c3-8f9f-317b9b616c9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.835 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[314de5c0-082f-446d-b7d7-6051ac9d1498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 NetworkManager[51654]: <info>  [1759394035.8378] manager: (tap08b16a0c-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Oct 02 08:33:55 compute-0 systemd-udevd[224725]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:33:55 compute-0 nova_compute[192567]: 2025-10-02 08:33:55.866 2 INFO nova.compute.manager [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Post operation of migration started
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.886 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[e3005fdf-ae30-4175-95ff-15f8e44b441c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.891 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[b1ee282e-437c-4830-942e-08e61187616e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 NetworkManager[51654]: <info>  [1759394035.9267] device (tap08b16a0c-b0): carrier: link connected
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.934 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd1ac65-5d7d-489e-8a03-330620dc1032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.957 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[7ea54ca1-1ee8-4ca5-8f0f-7b912c4944ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492351, 'reachable_time': 29075, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224744, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.974 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[aabdecf6-bcdc-48d2-bf82-0f580b987cb5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:c53f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492351, 'tstamp': 492351}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224745, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:55.998 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[9b2a92f9-ef45-47e2-9530-940f07dd1573]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492351, 'reachable_time': 29075, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224746, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:56.045 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e66243-4bab-4c40-977e-b7c067892857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:56.145 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d092e8-558b-43c1-8eb7-e8a97bbb8bb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:56.147 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:56.148 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:56.148 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:33:56 compute-0 kernel: tap08b16a0c-b0: entered promiscuous mode
Oct 02 08:33:56 compute-0 NetworkManager[51654]: <info>  [1759394036.1521] manager: (tap08b16a0c-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Oct 02 08:33:56 compute-0 nova_compute[192567]: 2025-10-02 08:33:56.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:56.155 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:33:56 compute-0 nova_compute[192567]: 2025-10-02 08:33:56.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:56 compute-0 ovn_controller[94821]: 2025-10-02T08:33:56Z|00198|binding|INFO|Releasing lport 748eef31-77a8-4b04-b6b7-dc0f7cc1cf65 from this chassis (sb_readonly=0)
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:56.159 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:56.161 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[31d78163-489c-4c0a-82b5-c45097b5d32d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:56.162 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/08b16a0c-b69f-4a34-9bfe-830099adfe8d.pid.haproxy
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:33:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:33:56.163 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'env', 'PROCESS_TAG=haproxy-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08b16a0c-b69f-4a34-9bfe-830099adfe8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:33:56 compute-0 nova_compute[192567]: 2025-10-02 08:33:56.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:56 compute-0 nova_compute[192567]: 2025-10-02 08:33:56.294 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-33ca8439-ed92-4aa7-a1ac-1f387ded8f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:33:56 compute-0 nova_compute[192567]: 2025-10-02 08:33:56.295 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-33ca8439-ed92-4aa7-a1ac-1f387ded8f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:33:56 compute-0 nova_compute[192567]: 2025-10-02 08:33:56.296 2 DEBUG nova.network.neutron [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:33:56 compute-0 podman[224779]: 2025-10-02 08:33:56.62378835 +0000 UTC m=+0.094024481 container create dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:33:56 compute-0 podman[224779]: 2025-10-02 08:33:56.578268951 +0000 UTC m=+0.048505132 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:33:56 compute-0 systemd[1]: Started libpod-conmon-dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8.scope.
Oct 02 08:33:56 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:33:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d235e02196bff33056fe0c1cac48e5ead6005983543da9958f41cd78d2bd76e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:33:56 compute-0 podman[224779]: 2025-10-02 08:33:56.748815307 +0000 UTC m=+0.219051488 container init dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct 02 08:33:56 compute-0 podman[224779]: 2025-10-02 08:33:56.758724396 +0000 UTC m=+0.228960547 container start dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct 02 08:33:56 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[224795]: [NOTICE]   (224799) : New worker (224801) forked
Oct 02 08:33:56 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[224795]: [NOTICE]   (224799) : Loading success.
Oct 02 08:33:57 compute-0 nova_compute[192567]: 2025-10-02 08:33:57.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:58 compute-0 podman[224810]: 2025-10-02 08:33:58.179429667 +0000 UTC m=+0.085983321 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:33:58 compute-0 nova_compute[192567]: 2025-10-02 08:33:58.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:33:58 compute-0 nova_compute[192567]: 2025-10-02 08:33:58.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:33:59 compute-0 nova_compute[192567]: 2025-10-02 08:33:59.356 2 DEBUG nova.network.neutron [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Updating instance_info_cache with network_info: [{"id": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "address": "fa:16:3e:21:41:32", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c491f18-8c", "ovs_interfaceid": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:33:59 compute-0 nova_compute[192567]: 2025-10-02 08:33:59.383 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-33ca8439-ed92-4aa7-a1ac-1f387ded8f6a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:33:59 compute-0 nova_compute[192567]: 2025-10-02 08:33:59.409 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:33:59 compute-0 nova_compute[192567]: 2025-10-02 08:33:59.410 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:33:59 compute-0 nova_compute[192567]: 2025-10-02 08:33:59.410 2 DEBUG oslo_concurrency.lockutils [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:33:59 compute-0 nova_compute[192567]: 2025-10-02 08:33:59.418 2 INFO nova.virt.libvirt.driver [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:33:59 compute-0 virtqemud[192112]: Domain id=18 name='instance-00000018' uuid=33ca8439-ed92-4aa7-a1ac-1f387ded8f6a is tainted: custom-monitor
Oct 02 08:33:59 compute-0 podman[203011]: time="2025-10-02T08:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:33:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:33:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Oct 02 08:34:00 compute-0 nova_compute[192567]: 2025-10-02 08:34:00.430 2 INFO nova.virt.libvirt.driver [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:34:01 compute-0 openstack_network_exporter[205118]: ERROR   08:34:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:34:01 compute-0 openstack_network_exporter[205118]: ERROR   08:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:34:01 compute-0 openstack_network_exporter[205118]: ERROR   08:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:34:01 compute-0 openstack_network_exporter[205118]: ERROR   08:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:34:01 compute-0 openstack_network_exporter[205118]: ERROR   08:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:34:01 compute-0 nova_compute[192567]: 2025-10-02 08:34:01.438 2 INFO nova.virt.libvirt.driver [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:34:01 compute-0 nova_compute[192567]: 2025-10-02 08:34:01.445 2 DEBUG nova.compute.manager [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:34:01 compute-0 nova_compute[192567]: 2025-10-02 08:34:01.476 2 DEBUG nova.objects.instance [None req-85e4758c-919b-4eaa-942d-8ac34334081a f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:34:02 compute-0 nova_compute[192567]: 2025-10-02 08:34:02.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:03 compute-0 nova_compute[192567]: 2025-10-02 08:34:03.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:03 compute-0 nova_compute[192567]: 2025-10-02 08:34:03.519 2 DEBUG nova.compute.manager [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppftgbvmq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e6c13926-8e85-4724-a812-5029db261405',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:34:03 compute-0 nova_compute[192567]: 2025-10-02 08:34:03.557 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-e6c13926-8e85-4724-a812-5029db261405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:34:03 compute-0 nova_compute[192567]: 2025-10-02 08:34:03.558 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-e6c13926-8e85-4724-a812-5029db261405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:34:03 compute-0 nova_compute[192567]: 2025-10-02 08:34:03.559 2 DEBUG nova.network.neutron [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:34:04 compute-0 nova_compute[192567]: 2025-10-02 08:34:04.883 2 DEBUG nova.network.neutron [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Updating instance_info_cache with network_info: [{"id": "73ccfee1-8e46-4551-9507-63259e83615c", "address": "fa:16:3e:c2:46:77", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ccfee1-8e", "ovs_interfaceid": "73ccfee1-8e46-4551-9507-63259e83615c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:34:04 compute-0 nova_compute[192567]: 2025-10-02 08:34:04.899 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-e6c13926-8e85-4724-a812-5029db261405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:34:04 compute-0 nova_compute[192567]: 2025-10-02 08:34:04.901 2 DEBUG nova.virt.libvirt.driver [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppftgbvmq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e6c13926-8e85-4724-a812-5029db261405',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:34:04 compute-0 nova_compute[192567]: 2025-10-02 08:34:04.902 2 DEBUG nova.virt.libvirt.driver [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Creating instance directory: /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:34:04 compute-0 nova_compute[192567]: 2025-10-02 08:34:04.902 2 DEBUG nova.virt.libvirt.driver [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Creating disk.info with the contents: {'/var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk': 'qcow2', '/var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:34:04 compute-0 nova_compute[192567]: 2025-10-02 08:34:04.903 2 DEBUG nova.virt.libvirt.driver [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:34:04 compute-0 nova_compute[192567]: 2025-10-02 08:34:04.903 2 DEBUG nova.objects.instance [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e6c13926-8e85-4724-a812-5029db261405 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:34:04 compute-0 nova_compute[192567]: 2025-10-02 08:34:04.934 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.025 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.026 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.027 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.039 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.111 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.112 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.151 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.153 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.153 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.218 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.220 2 DEBUG nova.virt.disk.api [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.221 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.275 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.277 2 DEBUG nova.virt.disk.api [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.277 2 DEBUG nova.objects.instance [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid e6c13926-8e85-4724-a812-5029db261405 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.301 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.328 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk.config 485376" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.330 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk.config to /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.330 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk.config /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.870 2 DEBUG oslo_concurrency.processutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405/disk.config /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.872 2 DEBUG nova.virt.libvirt.driver [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.875 2 DEBUG nova.virt.libvirt.vif [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:32:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1789825950',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1789825950',id=23,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:32:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-mqo9fkha',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:32:37Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=e6c13926-8e85-4724-a812-5029db261405,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73ccfee1-8e46-4551-9507-63259e83615c", "address": "fa:16:3e:c2:46:77", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap73ccfee1-8e", "ovs_interfaceid": "73ccfee1-8e46-4551-9507-63259e83615c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.875 2 DEBUG nova.network.os_vif_util [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "73ccfee1-8e46-4551-9507-63259e83615c", "address": "fa:16:3e:c2:46:77", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap73ccfee1-8e", "ovs_interfaceid": "73ccfee1-8e46-4551-9507-63259e83615c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.877 2 DEBUG nova.network.os_vif_util [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:46:77,bridge_name='br-int',has_traffic_filtering=True,id=73ccfee1-8e46-4551-9507-63259e83615c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ccfee1-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.878 2 DEBUG os_vif [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:46:77,bridge_name='br-int',has_traffic_filtering=True,id=73ccfee1-8e46-4551-9507-63259e83615c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ccfee1-8e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.880 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.881 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.886 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap73ccfee1-8e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.887 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap73ccfee1-8e, col_values=(('external_ids', {'iface-id': '73ccfee1-8e46-4551-9507-63259e83615c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:46:77', 'vm-uuid': 'e6c13926-8e85-4724-a812-5029db261405'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:05 compute-0 NetworkManager[51654]: <info>  [1759394045.8904] manager: (tap73ccfee1-8e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.903 2 INFO os_vif [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:46:77,bridge_name='br-int',has_traffic_filtering=True,id=73ccfee1-8e46-4551-9507-63259e83615c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ccfee1-8e')
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.903 2 DEBUG nova.virt.libvirt.driver [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:34:05 compute-0 nova_compute[192567]: 2025-10-02 08:34:05.904 2 DEBUG nova.compute.manager [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppftgbvmq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e6c13926-8e85-4724-a812-5029db261405',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:34:08 compute-0 nova_compute[192567]: 2025-10-02 08:34:08.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:09 compute-0 nova_compute[192567]: 2025-10-02 08:34:09.362 2 DEBUG nova.network.neutron [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Port 73ccfee1-8e46-4551-9507-63259e83615c updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:34:09 compute-0 nova_compute[192567]: 2025-10-02 08:34:09.365 2 DEBUG nova.compute.manager [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppftgbvmq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e6c13926-8e85-4724-a812-5029db261405',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:34:09 compute-0 kernel: tap73ccfee1-8e: entered promiscuous mode
Oct 02 08:34:09 compute-0 NetworkManager[51654]: <info>  [1759394049.6823] manager: (tap73ccfee1-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Oct 02 08:34:09 compute-0 ovn_controller[94821]: 2025-10-02T08:34:09Z|00199|binding|INFO|Claiming lport 73ccfee1-8e46-4551-9507-63259e83615c for this additional chassis.
Oct 02 08:34:09 compute-0 ovn_controller[94821]: 2025-10-02T08:34:09Z|00200|binding|INFO|73ccfee1-8e46-4551-9507-63259e83615c: Claiming fa:16:3e:c2:46:77 10.100.0.10
Oct 02 08:34:09 compute-0 systemd-udevd[224893]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:34:09 compute-0 nova_compute[192567]: 2025-10-02 08:34:09.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:09 compute-0 ovn_controller[94821]: 2025-10-02T08:34:09Z|00201|binding|INFO|Setting lport 73ccfee1-8e46-4551-9507-63259e83615c ovn-installed in OVS
Oct 02 08:34:09 compute-0 nova_compute[192567]: 2025-10-02 08:34:09.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:09 compute-0 NetworkManager[51654]: <info>  [1759394049.7494] device (tap73ccfee1-8e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:34:09 compute-0 nova_compute[192567]: 2025-10-02 08:34:09.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:09 compute-0 NetworkManager[51654]: <info>  [1759394049.7507] device (tap73ccfee1-8e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:34:09 compute-0 podman[224860]: 2025-10-02 08:34:09.761179977 +0000 UTC m=+0.160535595 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Oct 02 08:34:09 compute-0 systemd-machined[152597]: New machine qemu-19-instance-00000017.
Oct 02 08:34:09 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000017.
Oct 02 08:34:10 compute-0 nova_compute[192567]: 2025-10-02 08:34:10.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:11 compute-0 nova_compute[192567]: 2025-10-02 08:34:11.290 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394051.289655, e6c13926-8e85-4724-a812-5029db261405 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:34:11 compute-0 nova_compute[192567]: 2025-10-02 08:34:11.290 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: e6c13926-8e85-4724-a812-5029db261405] VM Started (Lifecycle Event)
Oct 02 08:34:11 compute-0 nova_compute[192567]: 2025-10-02 08:34:11.326 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: e6c13926-8e85-4724-a812-5029db261405] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:34:12 compute-0 nova_compute[192567]: 2025-10-02 08:34:12.077 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394052.0769947, e6c13926-8e85-4724-a812-5029db261405 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:34:12 compute-0 nova_compute[192567]: 2025-10-02 08:34:12.078 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: e6c13926-8e85-4724-a812-5029db261405] VM Resumed (Lifecycle Event)
Oct 02 08:34:12 compute-0 nova_compute[192567]: 2025-10-02 08:34:12.107 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: e6c13926-8e85-4724-a812-5029db261405] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:34:12 compute-0 nova_compute[192567]: 2025-10-02 08:34:12.112 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: e6c13926-8e85-4724-a812-5029db261405] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:34:12 compute-0 nova_compute[192567]: 2025-10-02 08:34:12.148 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: e6c13926-8e85-4724-a812-5029db261405] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:34:13 compute-0 nova_compute[192567]: 2025-10-02 08:34:13.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:13 compute-0 ovn_controller[94821]: 2025-10-02T08:34:13Z|00202|binding|INFO|Claiming lport 73ccfee1-8e46-4551-9507-63259e83615c for this chassis.
Oct 02 08:34:13 compute-0 ovn_controller[94821]: 2025-10-02T08:34:13Z|00203|binding|INFO|73ccfee1-8e46-4551-9507-63259e83615c: Claiming fa:16:3e:c2:46:77 10.100.0.10
Oct 02 08:34:13 compute-0 ovn_controller[94821]: 2025-10-02T08:34:13Z|00204|binding|INFO|Setting lport 73ccfee1-8e46-4551-9507-63259e83615c up in Southbound
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.402 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:46:77 10.100.0.10'], port_security=['fa:16:3e:c2:46:77 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e6c13926-8e85-4724-a812-5029db261405', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=73ccfee1-8e46-4551-9507-63259e83615c) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.405 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 73ccfee1-8e46-4551-9507-63259e83615c in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d bound to our chassis
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.407 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.437 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[16734ea6-0efc-4922-8584-867877f501cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.479 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[7f421da6-9aec-4c85-92e4-36f26cf8b57e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.484 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2dae43-ac22-4821-8eab-9211a28f320d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.534 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[177e120c-c9c1-4cf6-9821-f415897cd55b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:13 compute-0 nova_compute[192567]: 2025-10-02 08:34:13.564 2 INFO nova.compute.manager [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Post operation of migration started
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.573 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[32c2290a-517f-44a7-8524-7adcdf9cb9a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492351, 'reachable_time': 29075, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224933, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.604 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[1b824154-7ae2-4248-b016-8d25ed827274]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492367, 'tstamp': 492367}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224934, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492372, 'tstamp': 492372}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224934, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.607 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:13 compute-0 nova_compute[192567]: 2025-10-02 08:34:13.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.613 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.614 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.614 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:13 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:13.615 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:34:13 compute-0 nova_compute[192567]: 2025-10-02 08:34:13.912 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-e6c13926-8e85-4724-a812-5029db261405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:34:13 compute-0 nova_compute[192567]: 2025-10-02 08:34:13.912 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-e6c13926-8e85-4724-a812-5029db261405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:34:13 compute-0 nova_compute[192567]: 2025-10-02 08:34:13.913 2 DEBUG nova.network.neutron [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:34:15 compute-0 nova_compute[192567]: 2025-10-02 08:34:15.501 2 DEBUG nova.network.neutron [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Updating instance_info_cache with network_info: [{"id": "73ccfee1-8e46-4551-9507-63259e83615c", "address": "fa:16:3e:c2:46:77", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ccfee1-8e", "ovs_interfaceid": "73ccfee1-8e46-4551-9507-63259e83615c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:34:15 compute-0 nova_compute[192567]: 2025-10-02 08:34:15.519 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-e6c13926-8e85-4724-a812-5029db261405" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:34:15 compute-0 nova_compute[192567]: 2025-10-02 08:34:15.533 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:15 compute-0 nova_compute[192567]: 2025-10-02 08:34:15.534 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:15 compute-0 nova_compute[192567]: 2025-10-02 08:34:15.534 2 DEBUG oslo_concurrency.lockutils [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:15 compute-0 nova_compute[192567]: 2025-10-02 08:34:15.539 2 INFO nova.virt.libvirt.driver [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:34:15 compute-0 virtqemud[192112]: Domain id=19 name='instance-00000017' uuid=e6c13926-8e85-4724-a812-5029db261405 is tainted: custom-monitor
Oct 02 08:34:15 compute-0 nova_compute[192567]: 2025-10-02 08:34:15.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:16 compute-0 nova_compute[192567]: 2025-10-02 08:34:16.548 2 INFO nova.virt.libvirt.driver [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:34:17 compute-0 nova_compute[192567]: 2025-10-02 08:34:17.557 2 INFO nova.virt.libvirt.driver [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:34:17 compute-0 nova_compute[192567]: 2025-10-02 08:34:17.564 2 DEBUG nova.compute.manager [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:34:17 compute-0 nova_compute[192567]: 2025-10-02 08:34:17.586 2 DEBUG nova.objects.instance [None req-78d049ac-3638-4d98-861c-837630e6c381 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:34:18 compute-0 nova_compute[192567]: 2025-10-02 08:34:18.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:20 compute-0 nova_compute[192567]: 2025-10-02 08:34:20.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:23 compute-0 podman[224936]: 2025-10-02 08:34:23.175525594 +0000 UTC m=+0.085096510 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 02 08:34:23 compute-0 podman[224939]: 2025-10-02 08:34:23.189443941 +0000 UTC m=+0.085515926 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 02 08:34:23 compute-0 podman[224938]: 2025-10-02 08:34:23.201257393 +0000 UTC m=+0.094572783 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 08:34:23 compute-0 podman[224937]: 2025-10-02 08:34:23.210952427 +0000 UTC m=+0.120739520 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, config_id=ovn_controller)
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.281 2 DEBUG oslo_concurrency.lockutils [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.281 2 DEBUG oslo_concurrency.lockutils [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.282 2 DEBUG oslo_concurrency.lockutils [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.282 2 DEBUG oslo_concurrency.lockutils [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.282 2 DEBUG oslo_concurrency.lockutils [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.284 2 INFO nova.compute.manager [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Terminating instance
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.285 2 DEBUG nova.compute.manager [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:34:23 compute-0 kernel: tap1c491f18-8c (unregistering): left promiscuous mode
Oct 02 08:34:23 compute-0 NetworkManager[51654]: <info>  [1759394063.3130] device (tap1c491f18-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:34:23 compute-0 ovn_controller[94821]: 2025-10-02T08:34:23Z|00205|binding|INFO|Releasing lport 1c491f18-8c69-4b88-97a1-a2d47fcdb0ae from this chassis (sb_readonly=0)
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:23 compute-0 ovn_controller[94821]: 2025-10-02T08:34:23Z|00206|binding|INFO|Setting lport 1c491f18-8c69-4b88-97a1-a2d47fcdb0ae down in Southbound
Oct 02 08:34:23 compute-0 ovn_controller[94821]: 2025-10-02T08:34:23Z|00207|binding|INFO|Removing iface tap1c491f18-8c ovn-installed in OVS
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.340 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:41:32 10.100.0.8'], port_security=['fa:16:3e:21:41:32 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '33ca8439-ed92-4aa7-a1ac-1f387ded8f6a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=1c491f18-8c69-4b88-97a1-a2d47fcdb0ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.341 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 1c491f18-8c69-4b88-97a1-a2d47fcdb0ae in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.348 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.372 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[9af72e5d-13f2-4c9d-8e89-84f65457727d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:23 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Deactivated successfully.
Oct 02 08:34:23 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Consumed 3.331s CPU time.
Oct 02 08:34:23 compute-0 systemd-machined[152597]: Machine qemu-18-instance-00000018 terminated.
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.423 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[9835b3c6-d637-43c5-b183-0d5e25f6bc63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.428 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a13825-5fb6-4d65-88e4-f3cffe97b984]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.486 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e642cf-b064-405a-8160-d163587ca8be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.517 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f6fd3e-dad0-408b-9c6c-c4bcc3e75e3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08b16a0c-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:c5:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 43, 'tx_packets': 7, 'rx_bytes': 2302, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 43, 'tx_packets': 7, 'rx_bytes': 2302, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492351, 'reachable_time': 29075, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225029, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.548 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[61ad2b33-a4a8-43ed-b0d7-9a4bfb1023c2]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492367, 'tstamp': 492367}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225037, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap08b16a0c-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492372, 'tstamp': 492372}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225037, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.550 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.559 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b16a0c-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.560 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.561 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08b16a0c-b0, col_values=(('external_ids', {'iface-id': '748eef31-77a8-4b04-b6b7-dc0f7cc1cf65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:23 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:23.562 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.583 2 INFO nova.virt.libvirt.driver [-] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Instance destroyed successfully.
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.583 2 DEBUG nova.objects.instance [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'resources' on Instance uuid 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.602 2 DEBUG nova.virt.libvirt.vif [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:32:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-867953464',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-867953464',id=24,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:32:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-xw0wlb0j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:34:01Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=33ca8439-ed92-4aa7-a1ac-1f387ded8f6a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "address": "fa:16:3e:21:41:32", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c491f18-8c", "ovs_interfaceid": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.603 2 DEBUG nova.network.os_vif_util [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "address": "fa:16:3e:21:41:32", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c491f18-8c", "ovs_interfaceid": "1c491f18-8c69-4b88-97a1-a2d47fcdb0ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.604 2 DEBUG nova.network.os_vif_util [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:21:41:32,bridge_name='br-int',has_traffic_filtering=True,id=1c491f18-8c69-4b88-97a1-a2d47fcdb0ae,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c491f18-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.604 2 DEBUG os_vif [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:41:32,bridge_name='br-int',has_traffic_filtering=True,id=1c491f18-8c69-4b88-97a1-a2d47fcdb0ae,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c491f18-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.608 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c491f18-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.617 2 INFO os_vif [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:41:32,bridge_name='br-int',has_traffic_filtering=True,id=1c491f18-8c69-4b88-97a1-a2d47fcdb0ae,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c491f18-8c')
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.617 2 INFO nova.virt.libvirt.driver [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Deleting instance files /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a_del
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.618 2 INFO nova.virt.libvirt.driver [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Deletion of /var/lib/nova/instances/33ca8439-ed92-4aa7-a1ac-1f387ded8f6a_del complete
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.691 2 INFO nova.compute.manager [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Took 0.41 seconds to destroy the instance on the hypervisor.
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.692 2 DEBUG oslo.service.loopingcall [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.693 2 DEBUG nova.compute.manager [-] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:34:23 compute-0 nova_compute[192567]: 2025-10-02 08:34:23.694 2 DEBUG nova.network.neutron [-] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.316 2 DEBUG nova.network.neutron [-] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.330 2 DEBUG nova.compute.manager [req-95279d93-09ee-415f-9774-fe05cd9ad9e1 req-7cd96862-a1cc-4fd6-890c-d0fcc09f8d2f 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Received event network-vif-unplugged-1c491f18-8c69-4b88-97a1-a2d47fcdb0ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.331 2 DEBUG oslo_concurrency.lockutils [req-95279d93-09ee-415f-9774-fe05cd9ad9e1 req-7cd96862-a1cc-4fd6-890c-d0fcc09f8d2f 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.331 2 DEBUG oslo_concurrency.lockutils [req-95279d93-09ee-415f-9774-fe05cd9ad9e1 req-7cd96862-a1cc-4fd6-890c-d0fcc09f8d2f 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.331 2 DEBUG oslo_concurrency.lockutils [req-95279d93-09ee-415f-9774-fe05cd9ad9e1 req-7cd96862-a1cc-4fd6-890c-d0fcc09f8d2f 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.331 2 DEBUG nova.compute.manager [req-95279d93-09ee-415f-9774-fe05cd9ad9e1 req-7cd96862-a1cc-4fd6-890c-d0fcc09f8d2f 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] No waiting events found dispatching network-vif-unplugged-1c491f18-8c69-4b88-97a1-a2d47fcdb0ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.332 2 DEBUG nova.compute.manager [req-95279d93-09ee-415f-9774-fe05cd9ad9e1 req-7cd96862-a1cc-4fd6-890c-d0fcc09f8d2f 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Received event network-vif-unplugged-1c491f18-8c69-4b88-97a1-a2d47fcdb0ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.338 2 INFO nova.compute.manager [-] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Took 4.64 seconds to deallocate network for instance.
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.404 2 DEBUG oslo_concurrency.lockutils [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.404 2 DEBUG oslo_concurrency.lockutils [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.411 2 DEBUG oslo_concurrency.lockutils [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.424 2 DEBUG nova.compute.manager [req-478053c0-4db9-48ec-87b8-c8fd088ac783 req-8fc3d0ef-abdc-4813-9a8a-87088a9e945f 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Received event network-vif-deleted-1c491f18-8c69-4b88-97a1-a2d47fcdb0ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.445 2 INFO nova.scheduler.client.report [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Deleted allocations for instance 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a
Oct 02 08:34:28 compute-0 podman[225049]: 2025-10-02 08:34:28.451629192 +0000 UTC m=+0.098590648 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.511 2 DEBUG oslo_concurrency.lockutils [None req-33a8044b-b91d-4f5c-ba8a-3db6a0ff9dae bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.753 2 DEBUG oslo_concurrency.lockutils [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "e6c13926-8e85-4724-a812-5029db261405" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.754 2 DEBUG oslo_concurrency.lockutils [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "e6c13926-8e85-4724-a812-5029db261405" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.754 2 DEBUG oslo_concurrency.lockutils [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "e6c13926-8e85-4724-a812-5029db261405-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.754 2 DEBUG oslo_concurrency.lockutils [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "e6c13926-8e85-4724-a812-5029db261405-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.754 2 DEBUG oslo_concurrency.lockutils [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "e6c13926-8e85-4724-a812-5029db261405-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.755 2 INFO nova.compute.manager [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Terminating instance
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.756 2 DEBUG nova.compute.manager [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:34:28 compute-0 kernel: tap73ccfee1-8e (unregistering): left promiscuous mode
Oct 02 08:34:28 compute-0 NetworkManager[51654]: <info>  [1759394068.7875] device (tap73ccfee1-8e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:28 compute-0 ovn_controller[94821]: 2025-10-02T08:34:28Z|00208|binding|INFO|Releasing lport 73ccfee1-8e46-4551-9507-63259e83615c from this chassis (sb_readonly=0)
Oct 02 08:34:28 compute-0 ovn_controller[94821]: 2025-10-02T08:34:28Z|00209|binding|INFO|Setting lport 73ccfee1-8e46-4551-9507-63259e83615c down in Southbound
Oct 02 08:34:28 compute-0 ovn_controller[94821]: 2025-10-02T08:34:28Z|00210|binding|INFO|Removing iface tap73ccfee1-8e ovn-installed in OVS
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:28.807 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:46:77 10.100.0.10'], port_security=['fa:16:3e:c2:46:77 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e6c13926-8e85-4724-a812-5029db261405', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ea832b474574009921dff909e4daeaf', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'e77a766d-c240-4cfa-82bc-4e115822b1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=032751ae-b346-4bc8-8a72-10411cf5cf50, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=73ccfee1-8e46-4551-9507-63259e83615c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:34:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:28.809 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 73ccfee1-8e46-4551-9507-63259e83615c in datapath 08b16a0c-b69f-4a34-9bfe-830099adfe8d unbound from our chassis
Oct 02 08:34:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:28.811 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08b16a0c-b69f-4a34-9bfe-830099adfe8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:34:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:28.813 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[564b57ce-a6d6-4e0a-86a3-47ba525ac322]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:28.814 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d namespace which is not needed anymore
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:28 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000017.scope: Deactivated successfully.
Oct 02 08:34:28 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000017.scope: Consumed 2.908s CPU time.
Oct 02 08:34:28 compute-0 systemd-machined[152597]: Machine qemu-19-instance-00000017 terminated.
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:28 compute-0 nova_compute[192567]: 2025-10-02 08:34:28.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:28 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[224795]: [NOTICE]   (224799) : haproxy version is 2.8.14-c23fe91
Oct 02 08:34:28 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[224795]: [NOTICE]   (224799) : path to executable is /usr/sbin/haproxy
Oct 02 08:34:28 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[224795]: [WARNING]  (224799) : Exiting Master process...
Oct 02 08:34:29 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[224795]: [WARNING]  (224799) : Exiting Master process...
Oct 02 08:34:29 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[224795]: [ALERT]    (224799) : Current worker (224801) exited with code 143 (Terminated)
Oct 02 08:34:29 compute-0 neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d[224795]: [WARNING]  (224799) : All workers exited. Exiting... (0)
Oct 02 08:34:29 compute-0 systemd[1]: libpod-dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8.scope: Deactivated successfully.
Oct 02 08:34:29 compute-0 podman[225099]: 2025-10-02 08:34:29.009628634 +0000 UTC m=+0.059347806 container died dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.045 2 INFO nova.virt.libvirt.driver [-] [instance: e6c13926-8e85-4724-a812-5029db261405] Instance destroyed successfully.
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.045 2 DEBUG nova.objects.instance [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lazy-loading 'resources' on Instance uuid e6c13926-8e85-4724-a812-5029db261405 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:34:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8-userdata-shm.mount: Deactivated successfully.
Oct 02 08:34:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-8d235e02196bff33056fe0c1cac48e5ead6005983543da9958f41cd78d2bd76e-merged.mount: Deactivated successfully.
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.063 2 DEBUG nova.virt.libvirt.vif [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:32:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1789825950',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1789825950',id=23,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:32:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ea832b474574009921dff909e4daeaf',ramdisk_id='',reservation_id='r-mqo9fkha',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0
',owner_project_name='tempest-TestExecuteStrategies-1382092507',owner_user_name='tempest-TestExecuteStrategies-1382092507-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:34:17Z,user_data=None,user_id='bf38fbc8dd7b4c4db6c469a7951b0942',uuid=e6c13926-8e85-4724-a812-5029db261405,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "73ccfee1-8e46-4551-9507-63259e83615c", "address": "fa:16:3e:c2:46:77", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ccfee1-8e", "ovs_interfaceid": "73ccfee1-8e46-4551-9507-63259e83615c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.064 2 DEBUG nova.network.os_vif_util [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converting VIF {"id": "73ccfee1-8e46-4551-9507-63259e83615c", "address": "fa:16:3e:c2:46:77", "network": {"id": "08b16a0c-b69f-4a34-9bfe-830099adfe8d", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-168741141-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "abd3d383a4654ca4a9b9ffa12cb471b0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap73ccfee1-8e", "ovs_interfaceid": "73ccfee1-8e46-4551-9507-63259e83615c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.065 2 DEBUG nova.network.os_vif_util [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:46:77,bridge_name='br-int',has_traffic_filtering=True,id=73ccfee1-8e46-4551-9507-63259e83615c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ccfee1-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.066 2 DEBUG os_vif [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:46:77,bridge_name='br-int',has_traffic_filtering=True,id=73ccfee1-8e46-4551-9507-63259e83615c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ccfee1-8e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:34:29 compute-0 podman[225099]: 2025-10-02 08:34:29.06806316 +0000 UTC m=+0.117782322 container cleanup dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.068 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap73ccfee1-8e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.079 2 INFO os_vif [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:46:77,bridge_name='br-int',has_traffic_filtering=True,id=73ccfee1-8e46-4551-9507-63259e83615c,network=Network(08b16a0c-b69f-4a34-9bfe-830099adfe8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap73ccfee1-8e')
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.079 2 INFO nova.virt.libvirt.driver [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Deleting instance files /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405_del
Oct 02 08:34:29 compute-0 systemd[1]: libpod-conmon-dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8.scope: Deactivated successfully.
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.081 2 INFO nova.virt.libvirt.driver [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Deletion of /var/lib/nova/instances/e6c13926-8e85-4724-a812-5029db261405_del complete
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.160 2 INFO nova.compute.manager [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Took 0.40 seconds to destroy the instance on the hypervisor.
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.160 2 DEBUG oslo.service.loopingcall [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.161 2 DEBUG nova.compute.manager [-] [instance: e6c13926-8e85-4724-a812-5029db261405] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.161 2 DEBUG nova.network.neutron [-] [instance: e6c13926-8e85-4724-a812-5029db261405] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:34:29 compute-0 podman[225145]: 2025-10-02 08:34:29.164024065 +0000 UTC m=+0.055401192 container remove dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:34:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:29.170 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[377b4d18-bc1a-4ba2-94c2-a1232c07898b]: (4, ('Thu Oct  2 08:34:28 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8)\ndedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8\nThu Oct  2 08:34:29 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d (dedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8)\ndedf7885bff37761dfc628bc4d1779f2662375e97721e34de7e8c22b9170f4b8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:29.172 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[6b1c2ae3-634a-46f1-b2af-2b8d3d35f162]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:29.174 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b16a0c-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:29 compute-0 kernel: tap08b16a0c-b0: left promiscuous mode
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:29.184 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a4571d51-194a-485c-9dc9-097298bdb174]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:29.238 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[70ad4cc4-d99b-49db-84cb-ed97e162bfb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:29.239 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[d8096c92-5756-41d5-978e-01b8a22be552]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:29.263 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[92627023-5b7f-4a1e-b567-93c0bf232a83]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492341, 'reachable_time': 18994, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225161, 'error': None, 'target': 'ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:29.267 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08b16a0c-b69f-4a34-9bfe-830099adfe8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:34:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:29.268 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[1e77cbfe-7a8a-494f-9328-e8e2c594565c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:34:29 compute-0 systemd[1]: run-netns-ovnmeta\x2d08b16a0c\x2db69f\x2d4a34\x2d9bfe\x2d830099adfe8d.mount: Deactivated successfully.
Oct 02 08:34:29 compute-0 podman[203011]: time="2025-10-02T08:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:34:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:34:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.832 2 DEBUG nova.network.neutron [-] [instance: e6c13926-8e85-4724-a812-5029db261405] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.851 2 INFO nova.compute.manager [-] [instance: e6c13926-8e85-4724-a812-5029db261405] Took 0.69 seconds to deallocate network for instance.
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.901 2 DEBUG oslo_concurrency.lockutils [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.902 2 DEBUG oslo_concurrency.lockutils [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.908 2 DEBUG oslo_concurrency.lockutils [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.929 2 INFO nova.scheduler.client.report [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Deleted allocations for instance e6c13926-8e85-4724-a812-5029db261405
Oct 02 08:34:29 compute-0 nova_compute[192567]: 2025-10-02 08:34:29.992 2 DEBUG oslo_concurrency.lockutils [None req-1c285981-7da6-41b4-800e-f9f4201cee97 bf38fbc8dd7b4c4db6c469a7951b0942 1ea832b474574009921dff909e4daeaf - - default default] Lock "e6c13926-8e85-4724-a812-5029db261405" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.457 2 DEBUG nova.compute.manager [req-f85f2328-9d95-4baf-a6e7-991468376045 req-4337033c-16a8-401f-8bdb-fa6f49d088b4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Received event network-vif-plugged-1c491f18-8c69-4b88-97a1-a2d47fcdb0ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.458 2 DEBUG oslo_concurrency.lockutils [req-f85f2328-9d95-4baf-a6e7-991468376045 req-4337033c-16a8-401f-8bdb-fa6f49d088b4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.458 2 DEBUG oslo_concurrency.lockutils [req-f85f2328-9d95-4baf-a6e7-991468376045 req-4337033c-16a8-401f-8bdb-fa6f49d088b4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.458 2 DEBUG oslo_concurrency.lockutils [req-f85f2328-9d95-4baf-a6e7-991468376045 req-4337033c-16a8-401f-8bdb-fa6f49d088b4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "33ca8439-ed92-4aa7-a1ac-1f387ded8f6a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.458 2 DEBUG nova.compute.manager [req-f85f2328-9d95-4baf-a6e7-991468376045 req-4337033c-16a8-401f-8bdb-fa6f49d088b4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] No waiting events found dispatching network-vif-plugged-1c491f18-8c69-4b88-97a1-a2d47fcdb0ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.459 2 WARNING nova.compute.manager [req-f85f2328-9d95-4baf-a6e7-991468376045 req-4337033c-16a8-401f-8bdb-fa6f49d088b4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Received unexpected event network-vif-plugged-1c491f18-8c69-4b88-97a1-a2d47fcdb0ae for instance with vm_state deleted and task_state None.
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.459 2 DEBUG nova.compute.manager [req-f85f2328-9d95-4baf-a6e7-991468376045 req-4337033c-16a8-401f-8bdb-fa6f49d088b4 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Received event network-vif-deleted-73ccfee1-8e46-4551-9507-63259e83615c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.542 2 DEBUG nova.compute.manager [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Received event network-vif-unplugged-73ccfee1-8e46-4551-9507-63259e83615c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.542 2 DEBUG oslo_concurrency.lockutils [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "e6c13926-8e85-4724-a812-5029db261405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.543 2 DEBUG oslo_concurrency.lockutils [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "e6c13926-8e85-4724-a812-5029db261405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.543 2 DEBUG oslo_concurrency.lockutils [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "e6c13926-8e85-4724-a812-5029db261405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.544 2 DEBUG nova.compute.manager [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] No waiting events found dispatching network-vif-unplugged-73ccfee1-8e46-4551-9507-63259e83615c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.544 2 WARNING nova.compute.manager [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Received unexpected event network-vif-unplugged-73ccfee1-8e46-4551-9507-63259e83615c for instance with vm_state deleted and task_state None.
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.545 2 DEBUG nova.compute.manager [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Received event network-vif-plugged-73ccfee1-8e46-4551-9507-63259e83615c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.545 2 DEBUG oslo_concurrency.lockutils [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "e6c13926-8e85-4724-a812-5029db261405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.545 2 DEBUG oslo_concurrency.lockutils [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "e6c13926-8e85-4724-a812-5029db261405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.546 2 DEBUG oslo_concurrency.lockutils [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "e6c13926-8e85-4724-a812-5029db261405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.546 2 DEBUG nova.compute.manager [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] No waiting events found dispatching network-vif-plugged-73ccfee1-8e46-4551-9507-63259e83615c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:34:30 compute-0 nova_compute[192567]: 2025-10-02 08:34:30.546 2 WARNING nova.compute.manager [req-25420415-60fc-40c4-92e4-2854d1dad523 req-716d7127-ff70-443e-b472-9d81af873fb2 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: e6c13926-8e85-4724-a812-5029db261405] Received unexpected event network-vif-plugged-73ccfee1-8e46-4551-9507-63259e83615c for instance with vm_state deleted and task_state None.
Oct 02 08:34:31 compute-0 openstack_network_exporter[205118]: ERROR   08:34:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:34:31 compute-0 openstack_network_exporter[205118]: ERROR   08:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:34:31 compute-0 openstack_network_exporter[205118]: ERROR   08:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:34:31 compute-0 openstack_network_exporter[205118]: ERROR   08:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:34:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:34:31 compute-0 openstack_network_exporter[205118]: ERROR   08:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:34:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:34:33 compute-0 nova_compute[192567]: 2025-10-02 08:34:33.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:34 compute-0 nova_compute[192567]: 2025-10-02 08:34:34.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:38 compute-0 nova_compute[192567]: 2025-10-02 08:34:38.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:38 compute-0 nova_compute[192567]: 2025-10-02 08:34:38.581 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759394063.5793934, 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:34:38 compute-0 nova_compute[192567]: 2025-10-02 08:34:38.582 2 INFO nova.compute.manager [-] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] VM Stopped (Lifecycle Event)
Oct 02 08:34:38 compute-0 nova_compute[192567]: 2025-10-02 08:34:38.617 2 DEBUG nova.compute.manager [None req-d496d986-d475-4402-b562-1fcf7159f0a3 - - - - - -] [instance: 33ca8439-ed92-4aa7-a1ac-1f387ded8f6a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:34:39 compute-0 nova_compute[192567]: 2025-10-02 08:34:39.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:40 compute-0 podman[225162]: 2025-10-02 08:34:40.186547789 +0000 UTC m=+0.093711146 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 02 08:34:43 compute-0 nova_compute[192567]: 2025-10-02 08:34:43.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:43 compute-0 nova_compute[192567]: 2025-10-02 08:34:43.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:34:44 compute-0 nova_compute[192567]: 2025-10-02 08:34:44.043 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759394069.0426161, e6c13926-8e85-4724-a812-5029db261405 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:34:44 compute-0 nova_compute[192567]: 2025-10-02 08:34:44.044 2 INFO nova.compute.manager [-] [instance: e6c13926-8e85-4724-a812-5029db261405] VM Stopped (Lifecycle Event)
Oct 02 08:34:44 compute-0 nova_compute[192567]: 2025-10-02 08:34:44.068 2 DEBUG nova.compute.manager [None req-36dea58d-a0f4-4660-abba-39366971b2cb - - - - - -] [instance: e6c13926-8e85-4724-a812-5029db261405] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:34:44 compute-0 nova_compute[192567]: 2025-10-02 08:34:44.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:45 compute-0 nova_compute[192567]: 2025-10-02 08:34:45.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:34:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:45.994 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:45.995 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:34:45.995 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.658 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.659 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.659 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.659 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.912 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.915 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5861MB free_disk=73.46451187133789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.915 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.916 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.987 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:34:46 compute-0 nova_compute[192567]: 2025-10-02 08:34:46.987 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:34:47 compute-0 nova_compute[192567]: 2025-10-02 08:34:47.015 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:34:47 compute-0 nova_compute[192567]: 2025-10-02 08:34:47.033 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:34:47 compute-0 nova_compute[192567]: 2025-10-02 08:34:47.061 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:34:47 compute-0 nova_compute[192567]: 2025-10-02 08:34:47.061 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:34:48 compute-0 nova_compute[192567]: 2025-10-02 08:34:48.062 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:34:48 compute-0 nova_compute[192567]: 2025-10-02 08:34:48.063 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:34:48 compute-0 nova_compute[192567]: 2025-10-02 08:34:48.063 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:34:48 compute-0 nova_compute[192567]: 2025-10-02 08:34:48.077 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:34:48 compute-0 nova_compute[192567]: 2025-10-02 08:34:48.077 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:34:48 compute-0 nova_compute[192567]: 2025-10-02 08:34:48.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:49 compute-0 nova_compute[192567]: 2025-10-02 08:34:49.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:49 compute-0 nova_compute[192567]: 2025-10-02 08:34:49.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:34:49 compute-0 nova_compute[192567]: 2025-10-02 08:34:49.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:34:52 compute-0 nova_compute[192567]: 2025-10-02 08:34:52.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:34:52 compute-0 nova_compute[192567]: 2025-10-02 08:34:52.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:34:53 compute-0 nova_compute[192567]: 2025-10-02 08:34:53.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:54 compute-0 nova_compute[192567]: 2025-10-02 08:34:54.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:54 compute-0 podman[225185]: 2025-10-02 08:34:54.20307729 +0000 UTC m=+0.099000241 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 02 08:34:54 compute-0 podman[225188]: 2025-10-02 08:34:54.212045562 +0000 UTC m=+0.093848590 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 02 08:34:54 compute-0 podman[225187]: 2025-10-02 08:34:54.22312035 +0000 UTC m=+0.103923136 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Oct 02 08:34:54 compute-0 podman[225186]: 2025-10-02 08:34:54.245618977 +0000 UTC m=+0.128340344 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:34:58 compute-0 nova_compute[192567]: 2025-10-02 08:34:58.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:59 compute-0 nova_compute[192567]: 2025-10-02 08:34:59.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:34:59 compute-0 podman[225263]: 2025-10-02 08:34:59.176476051 +0000 UTC m=+0.089875845 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:34:59 compute-0 podman[203011]: time="2025-10-02T08:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:34:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:34:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Oct 02 08:34:59 compute-0 ovn_controller[94821]: 2025-10-02T08:34:59Z|00211|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct 02 08:35:00 compute-0 nova_compute[192567]: 2025-10-02 08:35:00.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:35:01 compute-0 openstack_network_exporter[205118]: ERROR   08:35:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:35:01 compute-0 openstack_network_exporter[205118]: ERROR   08:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:35:01 compute-0 openstack_network_exporter[205118]: ERROR   08:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:35:01 compute-0 openstack_network_exporter[205118]: ERROR   08:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:35:01 compute-0 openstack_network_exporter[205118]: ERROR   08:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:35:03 compute-0 nova_compute[192567]: 2025-10-02 08:35:03.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:04 compute-0 nova_compute[192567]: 2025-10-02 08:35:04.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:08 compute-0 nova_compute[192567]: 2025-10-02 08:35:08.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:09 compute-0 nova_compute[192567]: 2025-10-02 08:35:09.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:11 compute-0 podman[225288]: 2025-10-02 08:35:11.195627784 +0000 UTC m=+0.097177534 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Oct 02 08:35:13 compute-0 nova_compute[192567]: 2025-10-02 08:35:13.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:14 compute-0 nova_compute[192567]: 2025-10-02 08:35:14.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:18 compute-0 nova_compute[192567]: 2025-10-02 08:35:18.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:19 compute-0 nova_compute[192567]: 2025-10-02 08:35:19.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:35:19.967 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:35:19 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:35:19.968 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:35:19 compute-0 nova_compute[192567]: 2025-10-02 08:35:19.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:21 compute-0 nova_compute[192567]: 2025-10-02 08:35:21.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:23 compute-0 nova_compute[192567]: 2025-10-02 08:35:23.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:24 compute-0 nova_compute[192567]: 2025-10-02 08:35:24.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:25 compute-0 podman[225312]: 2025-10-02 08:35:25.175185364 +0000 UTC m=+0.080358796 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Oct 02 08:35:25 compute-0 podman[225310]: 2025-10-02 08:35:25.18653683 +0000 UTC m=+0.101061455 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:35:25 compute-0 podman[225318]: 2025-10-02 08:35:25.201526251 +0000 UTC m=+0.095453120 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:35:25 compute-0 podman[225311]: 2025-10-02 08:35:25.24795677 +0000 UTC m=+0.145089769 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 02 08:35:28 compute-0 nova_compute[192567]: 2025-10-02 08:35:28.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:28 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:35:28.972 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:35:29 compute-0 nova_compute[192567]: 2025-10-02 08:35:29.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:29 compute-0 podman[203011]: time="2025-10-02T08:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:35:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:35:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 02 08:35:30 compute-0 podman[225389]: 2025-10-02 08:35:30.185476941 +0000 UTC m=+0.085670193 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:35:31 compute-0 openstack_network_exporter[205118]: ERROR   08:35:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:35:31 compute-0 openstack_network_exporter[205118]: ERROR   08:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:35:31 compute-0 openstack_network_exporter[205118]: ERROR   08:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:35:31 compute-0 openstack_network_exporter[205118]: ERROR   08:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:35:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:35:31 compute-0 openstack_network_exporter[205118]: ERROR   08:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:35:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:35:33 compute-0 nova_compute[192567]: 2025-10-02 08:35:33.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:34 compute-0 nova_compute[192567]: 2025-10-02 08:35:34.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:35 compute-0 sshd-session[225413]: Invalid user admin from 80.94.95.25 port 44841
Oct 02 08:35:35 compute-0 sshd-session[225413]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 08:35:35 compute-0 sshd-session[225413]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.25
Oct 02 08:35:36 compute-0 sshd-session[225413]: Failed password for invalid user admin from 80.94.95.25 port 44841 ssh2
Oct 02 08:35:37 compute-0 sshd-session[225413]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 08:35:38 compute-0 nova_compute[192567]: 2025-10-02 08:35:38.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:39 compute-0 nova_compute[192567]: 2025-10-02 08:35:39.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:39 compute-0 sshd-session[225413]: Failed password for invalid user admin from 80.94.95.25 port 44841 ssh2
Oct 02 08:35:40 compute-0 sshd-session[225413]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 08:35:42 compute-0 podman[225415]: 2025-10-02 08:35:42.184331199 +0000 UTC m=+0.086983934 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350)
Oct 02 08:35:42 compute-0 sshd-session[225413]: Failed password for invalid user admin from 80.94.95.25 port 44841 ssh2
Oct 02 08:35:43 compute-0 sshd-session[225413]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 08:35:43 compute-0 nova_compute[192567]: 2025-10-02 08:35:43.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:43 compute-0 nova_compute[192567]: 2025-10-02 08:35:43.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:35:44 compute-0 nova_compute[192567]: 2025-10-02 08:35:44.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:45 compute-0 sshd-session[225413]: Failed password for invalid user admin from 80.94.95.25 port 44841 ssh2
Oct 02 08:35:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:35:45.997 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:35:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:35:45.998 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:35:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:35:45.999 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:35:46 compute-0 sshd-session[225413]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 08:35:46 compute-0 nova_compute[192567]: 2025-10-02 08:35:46.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:35:47 compute-0 nova_compute[192567]: 2025-10-02 08:35:47.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:35:47 compute-0 nova_compute[192567]: 2025-10-02 08:35:47.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:35:47 compute-0 nova_compute[192567]: 2025-10-02 08:35:47.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:35:47 compute-0 nova_compute[192567]: 2025-10-02 08:35:47.640 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:35:47 compute-0 sshd-session[225413]: Failed password for invalid user admin from 80.94.95.25 port 44841 ssh2
Oct 02 08:35:48 compute-0 nova_compute[192567]: 2025-10-02 08:35:48.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:48 compute-0 nova_compute[192567]: 2025-10-02 08:35:48.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:35:48 compute-0 nova_compute[192567]: 2025-10-02 08:35:48.666 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:35:48 compute-0 nova_compute[192567]: 2025-10-02 08:35:48.667 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:35:48 compute-0 nova_compute[192567]: 2025-10-02 08:35:48.667 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:35:48 compute-0 nova_compute[192567]: 2025-10-02 08:35:48.668 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:35:48 compute-0 sshd-session[225413]: Received disconnect from 80.94.95.25 port 44841:11: Bye [preauth]
Oct 02 08:35:48 compute-0 sshd-session[225413]: Disconnected from invalid user admin 80.94.95.25 port 44841 [preauth]
Oct 02 08:35:48 compute-0 sshd-session[225413]: PAM 4 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.25
Oct 02 08:35:48 compute-0 sshd-session[225413]: PAM service(sshd) ignoring max retries; 5 > 3
Oct 02 08:35:48 compute-0 nova_compute[192567]: 2025-10-02 08:35:48.894 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:35:48 compute-0 nova_compute[192567]: 2025-10-02 08:35:48.896 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5876MB free_disk=73.46449279785156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:35:48 compute-0 nova_compute[192567]: 2025-10-02 08:35:48.896 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:35:48 compute-0 nova_compute[192567]: 2025-10-02 08:35:48.897 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.003 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.003 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.029 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.102 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.103 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.121 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.168 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.193 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.210 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.213 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:35:49 compute-0 nova_compute[192567]: 2025-10-02 08:35:49.213 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:35:50 compute-0 nova_compute[192567]: 2025-10-02 08:35:50.214 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:35:50 compute-0 nova_compute[192567]: 2025-10-02 08:35:50.215 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:35:50 compute-0 nova_compute[192567]: 2025-10-02 08:35:50.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:35:52 compute-0 nova_compute[192567]: 2025-10-02 08:35:52.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:35:52 compute-0 nova_compute[192567]: 2025-10-02 08:35:52.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:35:53 compute-0 nova_compute[192567]: 2025-10-02 08:35:53.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:54 compute-0 nova_compute[192567]: 2025-10-02 08:35:54.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:54 compute-0 nova_compute[192567]: 2025-10-02 08:35:54.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:35:56 compute-0 podman[225436]: 2025-10-02 08:35:56.187786029 +0000 UTC m=+0.092397674 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:35:56 compute-0 podman[225438]: 2025-10-02 08:35:56.223022806 +0000 UTC m=+0.116295605 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd)
Oct 02 08:35:56 compute-0 podman[225439]: 2025-10-02 08:35:56.245650297 +0000 UTC m=+0.130976446 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 02 08:35:56 compute-0 podman[225437]: 2025-10-02 08:35:56.278382845 +0000 UTC m=+0.176457844 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:35:58 compute-0 nova_compute[192567]: 2025-10-02 08:35:58.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:59 compute-0 nova_compute[192567]: 2025-10-02 08:35:59.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:35:59 compute-0 podman[203011]: time="2025-10-02T08:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:35:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:35:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Oct 02 08:35:59 compute-0 ovn_controller[94821]: 2025-10-02T08:35:59Z|00212|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct 02 08:36:00 compute-0 nova_compute[192567]: 2025-10-02 08:36:00.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:01 compute-0 podman[225522]: 2025-10-02 08:36:01.177710386 +0000 UTC m=+0.086747647 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:36:01 compute-0 openstack_network_exporter[205118]: ERROR   08:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:36:01 compute-0 openstack_network_exporter[205118]: ERROR   08:36:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:36:01 compute-0 openstack_network_exporter[205118]: ERROR   08:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:36:01 compute-0 openstack_network_exporter[205118]: ERROR   08:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:36:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:36:01 compute-0 openstack_network_exporter[205118]: ERROR   08:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:36:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:36:03 compute-0 nova_compute[192567]: 2025-10-02 08:36:03.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:04 compute-0 nova_compute[192567]: 2025-10-02 08:36:04.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:08 compute-0 nova_compute[192567]: 2025-10-02 08:36:08.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:09 compute-0 nova_compute[192567]: 2025-10-02 08:36:09.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:12 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 08:36:12 compute-0 podman[225548]: 2025-10-02 08:36:12.649642259 +0000 UTC m=+0.096366849 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, config_id=edpm, io.openshift.tags=minimal 
rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Oct 02 08:36:13 compute-0 nova_compute[192567]: 2025-10-02 08:36:13.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:14 compute-0 nova_compute[192567]: 2025-10-02 08:36:14.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:14 compute-0 nova_compute[192567]: 2025-10-02 08:36:14.797 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:14 compute-0 nova_compute[192567]: 2025-10-02 08:36:14.798 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:14 compute-0 nova_compute[192567]: 2025-10-02 08:36:14.823 2 DEBUG nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:36:14 compute-0 nova_compute[192567]: 2025-10-02 08:36:14.928 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:14 compute-0 nova_compute[192567]: 2025-10-02 08:36:14.929 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:14 compute-0 nova_compute[192567]: 2025-10-02 08:36:14.935 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:36:14 compute-0 nova_compute[192567]: 2025-10-02 08:36:14.935 2 INFO nova.compute.claims [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.088 2 DEBUG nova.compute.provider_tree [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.103 2 DEBUG nova.scheduler.client.report [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.129 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.130 2 DEBUG nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.177 2 DEBUG nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.178 2 DEBUG nova.network.neutron [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.196 2 INFO nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.217 2 DEBUG nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.309 2 DEBUG nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.311 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.312 2 INFO nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Creating image(s)
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.313 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "/var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.314 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "/var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.315 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "/var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.354 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.455 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.456 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.457 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.477 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.557 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.559 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.602 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.603 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.603 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.676 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.677 2 DEBUG nova.virt.disk.api [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Checking if we can resize image /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.678 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.769 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.771 2 DEBUG nova.virt.disk.api [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Cannot resize image /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.772 2 DEBUG nova.objects.instance [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lazy-loading 'migration_context' on Instance uuid b0fe2b2e-e7da-43d6-9db8-920adf3145fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.789 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.789 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Ensure instance console log exists: /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.790 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.790 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.790 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:15 compute-0 nova_compute[192567]: 2025-10-02 08:36:15.844 2 DEBUG nova.network.neutron [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Successfully created port: 44e773e5-eb2f-4318-be7b-41cc770a4f02 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:36:16 compute-0 nova_compute[192567]: 2025-10-02 08:36:16.640 2 DEBUG nova.network.neutron [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Successfully updated port: 44e773e5-eb2f-4318-be7b-41cc770a4f02 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:36:16 compute-0 nova_compute[192567]: 2025-10-02 08:36:16.661 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:36:16 compute-0 nova_compute[192567]: 2025-10-02 08:36:16.662 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquired lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:36:16 compute-0 nova_compute[192567]: 2025-10-02 08:36:16.662 2 DEBUG nova.network.neutron [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:36:16 compute-0 nova_compute[192567]: 2025-10-02 08:36:16.798 2 DEBUG nova.compute.manager [req-c2e1e2c2-d984-435e-93a7-268d24d17e47 req-72c3d724-7e04-4cc6-8f4d-4dd148169b42 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Received event network-changed-44e773e5-eb2f-4318-be7b-41cc770a4f02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:36:16 compute-0 nova_compute[192567]: 2025-10-02 08:36:16.799 2 DEBUG nova.compute.manager [req-c2e1e2c2-d984-435e-93a7-268d24d17e47 req-72c3d724-7e04-4cc6-8f4d-4dd148169b42 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Refreshing instance network info cache due to event network-changed-44e773e5-eb2f-4318-be7b-41cc770a4f02. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:36:16 compute-0 nova_compute[192567]: 2025-10-02 08:36:16.800 2 DEBUG oslo_concurrency.lockutils [req-c2e1e2c2-d984-435e-93a7-268d24d17e47 req-72c3d724-7e04-4cc6-8f4d-4dd148169b42 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:36:16 compute-0 nova_compute[192567]: 2025-10-02 08:36:16.897 2 DEBUG nova.network.neutron [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.417 2 DEBUG nova.network.neutron [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Updating instance_info_cache with network_info: [{"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.445 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Releasing lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.445 2 DEBUG nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Instance network_info: |[{"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.446 2 DEBUG oslo_concurrency.lockutils [req-c2e1e2c2-d984-435e-93a7-268d24d17e47 req-72c3d724-7e04-4cc6-8f4d-4dd148169b42 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.446 2 DEBUG nova.network.neutron [req-c2e1e2c2-d984-435e-93a7-268d24d17e47 req-72c3d724-7e04-4cc6-8f4d-4dd148169b42 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Refreshing network info cache for port 44e773e5-eb2f-4318-be7b-41cc770a4f02 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.448 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Start _get_guest_xml network_info=[{"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.453 2 WARNING nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.457 2 DEBUG nova.virt.libvirt.host [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.458 2 DEBUG nova.virt.libvirt.host [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.460 2 DEBUG nova.virt.libvirt.host [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.460 2 DEBUG nova.virt.libvirt.host [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.461 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.461 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.461 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.461 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.462 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.462 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.462 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.462 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.462 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.463 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.463 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.463 2 DEBUG nova.virt.hardware [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.467 2 DEBUG nova.virt.libvirt.vif [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:36:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-296676066',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-296676066',id=26,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e1769197ed0402d83b04ce749e85a94',ramdisk_id='',reservation_id='r-4sg1kfzm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-314916947',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-314916947-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:36:15Z,user_data=None,user_id='e5ba920e8a0e4b888ef2e7bde621cf10',uuid=b0fe2b2e-e7da-43d6-9db8-920adf3145fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.467 2 DEBUG nova.network.os_vif_util [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Converting VIF {"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.468 2 DEBUG nova.network.os_vif_util [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:19:c8,bridge_name='br-int',has_traffic_filtering=True,id=44e773e5-eb2f-4318-be7b-41cc770a4f02,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44e773e5-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.468 2 DEBUG nova.objects.instance [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lazy-loading 'pci_devices' on Instance uuid b0fe2b2e-e7da-43d6-9db8-920adf3145fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.481 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <uuid>b0fe2b2e-e7da-43d6-9db8-920adf3145fa</uuid>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <name>instance-0000001a</name>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-296676066</nova:name>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:36:18</nova:creationTime>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:36:18 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:36:18 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:36:18 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:36:18 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:36:18 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:36:18 compute-0 nova_compute[192567]:         <nova:user uuid="e5ba920e8a0e4b888ef2e7bde621cf10">tempest-TestExecuteVmWorkloadBalanceStrategy-314916947-project-admin</nova:user>
Oct 02 08:36:18 compute-0 nova_compute[192567]:         <nova:project uuid="7e1769197ed0402d83b04ce749e85a94">tempest-TestExecuteVmWorkloadBalanceStrategy-314916947</nova:project>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:36:18 compute-0 nova_compute[192567]:         <nova:port uuid="44e773e5-eb2f-4318-be7b-41cc770a4f02">
Oct 02 08:36:18 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <system>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <entry name="serial">b0fe2b2e-e7da-43d6-9db8-920adf3145fa</entry>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <entry name="uuid">b0fe2b2e-e7da-43d6-9db8-920adf3145fa</entry>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     </system>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <os>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   </os>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <features>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   </features>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk.config"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:0f:19:c8"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <target dev="tap44e773e5-eb"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/console.log" append="off"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <video>
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     </video>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:36:18 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:36:18 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:36:18 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:36:18 compute-0 nova_compute[192567]: </domain>
Oct 02 08:36:18 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.482 2 DEBUG nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Preparing to wait for external event network-vif-plugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.482 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.483 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.483 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.483 2 DEBUG nova.virt.libvirt.vif [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:36:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-296676066',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-296676066',id=26,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e1769197ed0402d83b04ce749e85a94',ramdisk_id='',reservation_id='r-4sg1kfzm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-314916947',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-314916947-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:36:15Z,user_data=None,user_id='e5ba920e8a0e4b888ef2e7bde621cf10',uuid=b0fe2b2e-e7da-43d6-9db8-920adf3145fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.484 2 DEBUG nova.network.os_vif_util [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Converting VIF {"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.484 2 DEBUG nova.network.os_vif_util [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:19:c8,bridge_name='br-int',has_traffic_filtering=True,id=44e773e5-eb2f-4318-be7b-41cc770a4f02,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44e773e5-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.484 2 DEBUG os_vif [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:19:c8,bridge_name='br-int',has_traffic_filtering=True,id=44e773e5-eb2f-4318-be7b-41cc770a4f02,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44e773e5-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.485 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.485 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.488 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44e773e5-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.488 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap44e773e5-eb, col_values=(('external_ids', {'iface-id': '44e773e5-eb2f-4318-be7b-41cc770a4f02', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0f:19:c8', 'vm-uuid': 'b0fe2b2e-e7da-43d6-9db8-920adf3145fa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:18 compute-0 NetworkManager[51654]: <info>  [1759394178.5474] manager: (tap44e773e5-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.553 2 INFO os_vif [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:19:c8,bridge_name='br-int',has_traffic_filtering=True,id=44e773e5-eb2f-4318-be7b-41cc770a4f02,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44e773e5-eb')
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.593 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.594 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.595 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] No VIF found with MAC fa:16:3e:0f:19:c8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:36:18 compute-0 nova_compute[192567]: 2025-10-02 08:36:18.595 2 INFO nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Using config drive
Oct 02 08:36:20 compute-0 nova_compute[192567]: 2025-10-02 08:36:20.421 2 INFO nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Creating config drive at /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk.config
Oct 02 08:36:20 compute-0 nova_compute[192567]: 2025-10-02 08:36:20.425 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9stg6r0t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:36:20 compute-0 nova_compute[192567]: 2025-10-02 08:36:20.554 2 DEBUG oslo_concurrency.processutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9stg6r0t" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:36:20 compute-0 kernel: tap44e773e5-eb: entered promiscuous mode
Oct 02 08:36:20 compute-0 nova_compute[192567]: 2025-10-02 08:36:20.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:20 compute-0 ovn_controller[94821]: 2025-10-02T08:36:20Z|00213|binding|INFO|Claiming lport 44e773e5-eb2f-4318-be7b-41cc770a4f02 for this chassis.
Oct 02 08:36:20 compute-0 ovn_controller[94821]: 2025-10-02T08:36:20Z|00214|binding|INFO|44e773e5-eb2f-4318-be7b-41cc770a4f02: Claiming fa:16:3e:0f:19:c8 10.100.0.5
Oct 02 08:36:20 compute-0 NetworkManager[51654]: <info>  [1759394180.6456] manager: (tap44e773e5-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Oct 02 08:36:20 compute-0 nova_compute[192567]: 2025-10-02 08:36:20.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:20 compute-0 nova_compute[192567]: 2025-10-02 08:36:20.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.666 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:19:c8 10.100.0.5'], port_security=['fa:16:3e:0f:19:c8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b0fe2b2e-e7da-43d6-9db8-920adf3145fa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e1769197ed0402d83b04ce749e85a94', 'neutron:revision_number': '2', 'neutron:security_group_ids': '57446da9-02a4-4c71-8f97-35915eb59ad9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d33da5a5-f1cf-4e1d-b2d2-a87e57551306, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=44e773e5-eb2f-4318-be7b-41cc770a4f02) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.668 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 44e773e5-eb2f-4318-be7b-41cc770a4f02 in datapath 04c40a40-1a0a-4c9e-b85f-8553f3cc214c bound to our chassis
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.670 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04c40a40-1a0a-4c9e-b85f-8553f3cc214c
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.690 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[1c44e8eb-e7e2-45b9-b8d8-58930ee6c7e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.691 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap04c40a40-11 in ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:36:20 compute-0 systemd-udevd[225605]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.694 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap04c40a40-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.694 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b1e5ede4-7a7b-4ade-a34a-9b8425b713c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.695 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b49adfde-b825-4167-8e43-cf158c89c80c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 systemd-machined[152597]: New machine qemu-20-instance-0000001a.
Oct 02 08:36:20 compute-0 NetworkManager[51654]: <info>  [1759394180.7130] device (tap44e773e5-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:36:20 compute-0 NetworkManager[51654]: <info>  [1759394180.7163] device (tap44e773e5-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.716 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[3159d930-f730-4a96-a353-7fdaa56e1781]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.742 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[30e523a5-c879-403a-9e30-edd34119bd3e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 nova_compute[192567]: 2025-10-02 08:36:20.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:20 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001a.
Oct 02 08:36:20 compute-0 ovn_controller[94821]: 2025-10-02T08:36:20Z|00215|binding|INFO|Setting lport 44e773e5-eb2f-4318-be7b-41cc770a4f02 ovn-installed in OVS
Oct 02 08:36:20 compute-0 ovn_controller[94821]: 2025-10-02T08:36:20Z|00216|binding|INFO|Setting lport 44e773e5-eb2f-4318-be7b-41cc770a4f02 up in Southbound
Oct 02 08:36:20 compute-0 nova_compute[192567]: 2025-10-02 08:36:20.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.789 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[48b3a0eb-8678-4f08-828f-04603dd32125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 NetworkManager[51654]: <info>  [1759394180.7977] manager: (tap04c40a40-10): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.796 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[08aa3369-4eea-4259-9ca6-681c5ce6a95b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.839 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[43df3217-f063-4127-b45e-8c0e3f724280]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.844 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[f44461f4-4515-4df6-9981-c5a3e3945cae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 NetworkManager[51654]: <info>  [1759394180.8684] device (tap04c40a40-10): carrier: link connected
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.872 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[842641b0-1da9-49e2-b833-50c883d0b43e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.893 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[15a9f1a7-c5f6-4a46-bfdc-d493ab57afce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04c40a40-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:61:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506845, 'reachable_time': 39398, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225638, 'error': None, 'target': 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.916 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[48d435ef-ae22-4409-b52b-bda162f3ff34]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9a:6179'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506845, 'tstamp': 506845}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225639, 'error': None, 'target': 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.943 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[545d09a1-1dcd-4ef8-a20a-7a11814bda45]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04c40a40-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:61:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506845, 'reachable_time': 39398, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225640, 'error': None, 'target': 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:20 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:20.984 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[814302a6-8d65-4097-9470-97a94c3466f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:21.073 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[13691136-fc51-4c9c-a56d-203d3b2ee9ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:21.074 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04c40a40-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:21.075 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:21.076 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04c40a40-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:36:21 compute-0 NetworkManager[51654]: <info>  [1759394181.0796] manager: (tap04c40a40-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Oct 02 08:36:21 compute-0 kernel: tap04c40a40-10: entered promiscuous mode
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:21.083 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04c40a40-10, col_values=(('external_ids', {'iface-id': 'c4a7e4d0-d6dc-42e2-8605-1bf264bb43d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:21 compute-0 ovn_controller[94821]: 2025-10-02T08:36:21Z|00217|binding|INFO|Releasing lport c4a7e4d0-d6dc-42e2-8605-1bf264bb43d8 from this chassis (sb_readonly=0)
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:21.112 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/04c40a40-1a0a-4c9e-b85f-8553f3cc214c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/04c40a40-1a0a-4c9e-b85f-8553f3cc214c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:21.114 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[90b0c69c-5c47-4a7c-90a4-ff305ec0c502]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:21.115 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-04c40a40-1a0a-4c9e-b85f-8553f3cc214c
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/04c40a40-1a0a-4c9e-b85f-8553f3cc214c.pid.haproxy
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 04c40a40-1a0a-4c9e-b85f-8553f3cc214c
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:36:21 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:21.116 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'env', 'PROCESS_TAG=haproxy-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/04c40a40-1a0a-4c9e-b85f-8553f3cc214c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:36:21 compute-0 podman[225679]: 2025-10-02 08:36:21.58765095 +0000 UTC m=+0.079058545 container create b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.608 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394181.607755, b0fe2b2e-e7da-43d6-9db8-920adf3145fa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.610 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] VM Started (Lifecycle Event)
Oct 02 08:36:21 compute-0 systemd[1]: Started libpod-conmon-b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588.scope.
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.635 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.638 2 DEBUG nova.compute.manager [req-87af21a7-8553-4ed5-b6d7-4503f5bfb879 req-4203d15e-98a8-489b-b61f-4d2dfa28d55c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Received event network-vif-plugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.638 2 DEBUG oslo_concurrency.lockutils [req-87af21a7-8553-4ed5-b6d7-4503f5bfb879 req-4203d15e-98a8-489b-b61f-4d2dfa28d55c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.639 2 DEBUG oslo_concurrency.lockutils [req-87af21a7-8553-4ed5-b6d7-4503f5bfb879 req-4203d15e-98a8-489b-b61f-4d2dfa28d55c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.639 2 DEBUG oslo_concurrency.lockutils [req-87af21a7-8553-4ed5-b6d7-4503f5bfb879 req-4203d15e-98a8-489b-b61f-4d2dfa28d55c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.640 2 DEBUG nova.compute.manager [req-87af21a7-8553-4ed5-b6d7-4503f5bfb879 req-4203d15e-98a8-489b-b61f-4d2dfa28d55c 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Processing event network-vif-plugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.640 2 DEBUG nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:36:21 compute-0 podman[225679]: 2025-10-02 08:36:21.552112343 +0000 UTC m=+0.043520008 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.645 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.646 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.649 2 INFO nova.virt.libvirt.driver [-] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Instance spawned successfully.
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.649 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:36:21 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.664 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.665 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394181.6085324, b0fe2b2e-e7da-43d6-9db8-920adf3145fa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.665 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] VM Paused (Lifecycle Event)
Oct 02 08:36:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb930d9235ec845fc214b2d790053d2d094993269cd27cbda1a6e33edf2e8083/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.673 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.673 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.673 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.674 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.674 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.674 2 DEBUG nova.virt.libvirt.driver [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:36:21 compute-0 podman[225679]: 2025-10-02 08:36:21.690644785 +0000 UTC m=+0.182052360 container init b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.695 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:36:21 compute-0 podman[225679]: 2025-10-02 08:36:21.697874493 +0000 UTC m=+0.189282068 container start b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001)
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.699 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394181.6429849, b0fe2b2e-e7da-43d6-9db8-920adf3145fa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.700 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] VM Resumed (Lifecycle Event)
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.718 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:36:21 compute-0 neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c[225694]: [NOTICE]   (225698) : New worker (225700) forked
Oct 02 08:36:21 compute-0 neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c[225694]: [NOTICE]   (225698) : Loading success.
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.722 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.727 2 INFO nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Took 6.42 seconds to spawn the instance on the hypervisor.
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.727 2 DEBUG nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.747 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.778 2 INFO nova.compute.manager [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Took 6.89 seconds to build instance.
Oct 02 08:36:21 compute-0 nova_compute[192567]: 2025-10-02 08:36:21.791 2 DEBUG oslo_concurrency.lockutils [None req-4ec55ecd-cc1e-4dec-aa79-ba52c9ff88d9 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.993s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:22 compute-0 nova_compute[192567]: 2025-10-02 08:36:22.440 2 DEBUG nova.network.neutron [req-c2e1e2c2-d984-435e-93a7-268d24d17e47 req-72c3d724-7e04-4cc6-8f4d-4dd148169b42 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Updated VIF entry in instance network info cache for port 44e773e5-eb2f-4318-be7b-41cc770a4f02. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:36:22 compute-0 nova_compute[192567]: 2025-10-02 08:36:22.441 2 DEBUG nova.network.neutron [req-c2e1e2c2-d984-435e-93a7-268d24d17e47 req-72c3d724-7e04-4cc6-8f4d-4dd148169b42 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Updating instance_info_cache with network_info: [{"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:36:22 compute-0 nova_compute[192567]: 2025-10-02 08:36:22.463 2 DEBUG oslo_concurrency.lockutils [req-c2e1e2c2-d984-435e-93a7-268d24d17e47 req-72c3d724-7e04-4cc6-8f4d-4dd148169b42 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:36:23 compute-0 nova_compute[192567]: 2025-10-02 08:36:23.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:23 compute-0 nova_compute[192567]: 2025-10-02 08:36:23.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:36:23 compute-0 nova_compute[192567]: 2025-10-02 08:36:23.756 2 DEBUG nova.compute.manager [req-d5b8479d-30ec-49f3-b43e-9b2b74e6b642 req-3372eddd-e79f-4fe4-b90a-0357e57c9e06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Received event network-vif-plugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:36:23 compute-0 nova_compute[192567]: 2025-10-02 08:36:23.757 2 DEBUG oslo_concurrency.lockutils [req-d5b8479d-30ec-49f3-b43e-9b2b74e6b642 req-3372eddd-e79f-4fe4-b90a-0357e57c9e06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:23 compute-0 nova_compute[192567]: 2025-10-02 08:36:23.758 2 DEBUG oslo_concurrency.lockutils [req-d5b8479d-30ec-49f3-b43e-9b2b74e6b642 req-3372eddd-e79f-4fe4-b90a-0357e57c9e06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:23 compute-0 nova_compute[192567]: 2025-10-02 08:36:23.758 2 DEBUG oslo_concurrency.lockutils [req-d5b8479d-30ec-49f3-b43e-9b2b74e6b642 req-3372eddd-e79f-4fe4-b90a-0357e57c9e06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:23 compute-0 nova_compute[192567]: 2025-10-02 08:36:23.759 2 DEBUG nova.compute.manager [req-d5b8479d-30ec-49f3-b43e-9b2b74e6b642 req-3372eddd-e79f-4fe4-b90a-0357e57c9e06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] No waiting events found dispatching network-vif-plugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:36:23 compute-0 nova_compute[192567]: 2025-10-02 08:36:23.759 2 WARNING nova.compute.manager [req-d5b8479d-30ec-49f3-b43e-9b2b74e6b642 req-3372eddd-e79f-4fe4-b90a-0357e57c9e06 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Received unexpected event network-vif-plugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 for instance with vm_state active and task_state None.
Oct 02 08:36:27 compute-0 podman[225710]: 2025-10-02 08:36:27.183620307 +0000 UTC m=+0.096637247 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:36:27 compute-0 podman[225709]: 2025-10-02 08:36:27.191224306 +0000 UTC m=+0.098364070 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:36:27 compute-0 podman[225711]: 2025-10-02 08:36:27.207114336 +0000 UTC m=+0.114916461 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:36:27 compute-0 podman[225715]: 2025-10-02 08:36:27.222934893 +0000 UTC m=+0.114371175 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 02 08:36:28 compute-0 nova_compute[192567]: 2025-10-02 08:36:28.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4994-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:36:28 compute-0 nova_compute[192567]: 2025-10-02 08:36:28.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:36:28 compute-0 nova_compute[192567]: 2025-10-02 08:36:28.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 02 08:36:28 compute-0 nova_compute[192567]: 2025-10-02 08:36:28.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:36:28 compute-0 nova_compute[192567]: 2025-10-02 08:36:28.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:28 compute-0 nova_compute[192567]: 2025-10-02 08:36:28.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:36:29 compute-0 podman[203011]: time="2025-10-02T08:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:36:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:36:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3477 "" "Go-http-client/1.1"
Oct 02 08:36:31 compute-0 openstack_network_exporter[205118]: ERROR   08:36:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:36:31 compute-0 openstack_network_exporter[205118]: ERROR   08:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:36:31 compute-0 openstack_network_exporter[205118]: ERROR   08:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:36:31 compute-0 openstack_network_exporter[205118]: ERROR   08:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:36:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:36:31 compute-0 openstack_network_exporter[205118]: ERROR   08:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:36:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:36:32 compute-0 podman[225801]: 2025-10-02 08:36:32.173213885 +0000 UTC m=+0.085172928 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:36:32 compute-0 ovn_controller[94821]: 2025-10-02T08:36:32Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0f:19:c8 10.100.0.5
Oct 02 08:36:32 compute-0 ovn_controller[94821]: 2025-10-02T08:36:32Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0f:19:c8 10.100.0.5
Oct 02 08:36:33 compute-0 nova_compute[192567]: 2025-10-02 08:36:33.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:38 compute-0 nova_compute[192567]: 2025-10-02 08:36:38.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:36:42 compute-0 nova_compute[192567]: 2025-10-02 08:36:42.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:42 compute-0 nova_compute[192567]: 2025-10-02 08:36:42.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:36:42 compute-0 nova_compute[192567]: 2025-10-02 08:36:42.646 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:36:42 compute-0 nova_compute[192567]: 2025-10-02 08:36:42.646 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:42 compute-0 nova_compute[192567]: 2025-10-02 08:36:42.647 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:36:43 compute-0 podman[225825]: 2025-10-02 08:36:43.186210239 +0000 UTC m=+0.095820993 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:36:43 compute-0 nova_compute[192567]: 2025-10-02 08:36:43.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:45 compute-0 nova_compute[192567]: 2025-10-02 08:36:45.661 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:45.999 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:46.000 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:36:46.000 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:47 compute-0 nova_compute[192567]: 2025-10-02 08:36:47.627 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:47 compute-0 nova_compute[192567]: 2025-10-02 08:36:47.628 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:36:47 compute-0 nova_compute[192567]: 2025-10-02 08:36:47.628 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:36:48 compute-0 nova_compute[192567]: 2025-10-02 08:36:48.136 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:36:48 compute-0 nova_compute[192567]: 2025-10-02 08:36:48.137 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:36:48 compute-0 nova_compute[192567]: 2025-10-02 08:36:48.137 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:36:48 compute-0 nova_compute[192567]: 2025-10-02 08:36:48.138 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b0fe2b2e-e7da-43d6-9db8-920adf3145fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:36:48 compute-0 nova_compute[192567]: 2025-10-02 08:36:48.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:36:48 compute-0 nova_compute[192567]: 2025-10-02 08:36:48.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:36:48 compute-0 nova_compute[192567]: 2025-10-02 08:36:48.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 02 08:36:48 compute-0 nova_compute[192567]: 2025-10-02 08:36:48.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:36:48 compute-0 nova_compute[192567]: 2025-10-02 08:36:48.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:48 compute-0 nova_compute[192567]: 2025-10-02 08:36:48.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.278 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Updating instance_info_cache with network_info: [{"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.300 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.301 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.302 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.302 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.326 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.327 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.328 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.328 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.418 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.477 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.479 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.551 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.783 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.785 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5705MB free_disk=73.43561172485352GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.786 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.786 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.877 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance b0fe2b2e-e7da-43d6-9db8-920adf3145fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.878 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.878 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.943 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.959 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.982 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:36:50 compute-0 nova_compute[192567]: 2025-10-02 08:36:50.982 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:36:51 compute-0 ovn_controller[94821]: 2025-10-02T08:36:51Z|00218|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Oct 02 08:36:51 compute-0 nova_compute[192567]: 2025-10-02 08:36:51.304 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:51 compute-0 nova_compute[192567]: 2025-10-02 08:36:51.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:51 compute-0 nova_compute[192567]: 2025-10-02 08:36:51.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:52 compute-0 nova_compute[192567]: 2025-10-02 08:36:52.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:36:52 compute-0 nova_compute[192567]: 2025-10-02 08:36:52.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:36:53 compute-0 nova_compute[192567]: 2025-10-02 08:36:53.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:36:58 compute-0 podman[225854]: 2025-10-02 08:36:58.216897813 +0000 UTC m=+0.114152887 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:36:58 compute-0 podman[225857]: 2025-10-02 08:36:58.232108471 +0000 UTC m=+0.109405928 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 02 08:36:58 compute-0 podman[225856]: 2025-10-02 08:36:58.255497816 +0000 UTC m=+0.139159553 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:36:58 compute-0 podman[225855]: 2025-10-02 08:36:58.262839737 +0000 UTC m=+0.153844434 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct 02 08:36:58 compute-0 nova_compute[192567]: 2025-10-02 08:36:58.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:36:58 compute-0 nova_compute[192567]: 2025-10-02 08:36:58.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:36:58 compute-0 nova_compute[192567]: 2025-10-02 08:36:58.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 02 08:36:58 compute-0 nova_compute[192567]: 2025-10-02 08:36:58.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:36:58 compute-0 nova_compute[192567]: 2025-10-02 08:36:58.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:36:58 compute-0 nova_compute[192567]: 2025-10-02 08:36:58.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:36:59 compute-0 podman[203011]: time="2025-10-02T08:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:36:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:36:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3471 "" "Go-http-client/1.1"
Oct 02 08:37:00 compute-0 nova_compute[192567]: 2025-10-02 08:37:00.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:01 compute-0 openstack_network_exporter[205118]: ERROR   08:37:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:37:01 compute-0 openstack_network_exporter[205118]: ERROR   08:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:37:01 compute-0 openstack_network_exporter[205118]: ERROR   08:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:37:01 compute-0 openstack_network_exporter[205118]: ERROR   08:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:37:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:37:01 compute-0 openstack_network_exporter[205118]: ERROR   08:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:37:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:37:02 compute-0 nova_compute[192567]: 2025-10-02 08:37:02.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:03 compute-0 podman[225937]: 2025-10-02 08:37:03.176203579 +0000 UTC m=+0.079639034 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:37:03 compute-0 nova_compute[192567]: 2025-10-02 08:37:03.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:08 compute-0 nova_compute[192567]: 2025-10-02 08:37:08.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:37:08 compute-0 nova_compute[192567]: 2025-10-02 08:37:08.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:37:08 compute-0 nova_compute[192567]: 2025-10-02 08:37:08.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 02 08:37:08 compute-0 nova_compute[192567]: 2025-10-02 08:37:08.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:37:08 compute-0 nova_compute[192567]: 2025-10-02 08:37:08.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:37:08 compute-0 nova_compute[192567]: 2025-10-02 08:37:08.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:13 compute-0 nova_compute[192567]: 2025-10-02 08:37:13.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:37:13 compute-0 nova_compute[192567]: 2025-10-02 08:37:13.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:13 compute-0 nova_compute[192567]: 2025-10-02 08:37:13.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 02 08:37:13 compute-0 nova_compute[192567]: 2025-10-02 08:37:13.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:37:13 compute-0 nova_compute[192567]: 2025-10-02 08:37:13.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:37:13 compute-0 nova_compute[192567]: 2025-10-02 08:37:13.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:14 compute-0 podman[225962]: 2025-10-02 08:37:14.205565057 +0000 UTC m=+0.115149039 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Oct 02 08:37:18 compute-0 nova_compute[192567]: 2025-10-02 08:37:18.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:23 compute-0 nova_compute[192567]: 2025-10-02 08:37:23.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:37:28 compute-0 nova_compute[192567]: 2025-10-02 08:37:28.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:28 compute-0 nova_compute[192567]: 2025-10-02 08:37:28.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:29 compute-0 podman[225987]: 2025-10-02 08:37:29.169664229 +0000 UTC m=+0.080777389 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Oct 02 08:37:29 compute-0 podman[225985]: 2025-10-02 08:37:29.181403408 +0000 UTC m=+0.091535317 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:37:29 compute-0 podman[225988]: 2025-10-02 08:37:29.190849854 +0000 UTC m=+0.088315185 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:37:29 compute-0 podman[225986]: 2025-10-02 08:37:29.243651704 +0000 UTC m=+0.148929390 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:37:29 compute-0 podman[203011]: time="2025-10-02T08:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:37:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:37:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3478 "" "Go-http-client/1.1"
Oct 02 08:37:31 compute-0 openstack_network_exporter[205118]: ERROR   08:37:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:37:31 compute-0 openstack_network_exporter[205118]: ERROR   08:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:37:31 compute-0 openstack_network_exporter[205118]: ERROR   08:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:37:31 compute-0 openstack_network_exporter[205118]: ERROR   08:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:37:31 compute-0 openstack_network_exporter[205118]: ERROR   08:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:37:33 compute-0 nova_compute[192567]: 2025-10-02 08:37:33.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:34 compute-0 podman[226083]: 2025-10-02 08:37:34.172986977 +0000 UTC m=+0.079488128 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:37:38 compute-0 nova_compute[192567]: 2025-10-02 08:37:38.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:37:38 compute-0 nova_compute[192567]: 2025-10-02 08:37:38.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:37:38 compute-0 nova_compute[192567]: 2025-10-02 08:37:38.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Oct 02 08:37:38 compute-0 nova_compute[192567]: 2025-10-02 08:37:38.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:37:38 compute-0 nova_compute[192567]: 2025-10-02 08:37:38.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:38 compute-0 nova_compute[192567]: 2025-10-02 08:37:38.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 02 08:37:43 compute-0 nova_compute[192567]: 2025-10-02 08:37:43.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:37:43 compute-0 nova_compute[192567]: 2025-10-02 08:37:43.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:43 compute-0 nova_compute[192567]: 2025-10-02 08:37:43.805 2 DEBUG nova.virt.libvirt.driver [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Creating tmpfile /var/lib/nova/instances/tmph39wgxi6 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:37:43 compute-0 nova_compute[192567]: 2025-10-02 08:37:43.928 2 DEBUG nova.compute.manager [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph39wgxi6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:37:45 compute-0 podman[226108]: 2025-10-02 08:37:45.188535031 +0000 UTC m=+0.093752286 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git)
Oct 02 08:37:45 compute-0 nova_compute[192567]: 2025-10-02 08:37:45.227 2 DEBUG nova.compute.manager [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph39wgxi6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:37:45 compute-0 nova_compute[192567]: 2025-10-02 08:37:45.248 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:37:45 compute-0 nova_compute[192567]: 2025-10-02 08:37:45.248 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:37:45 compute-0 nova_compute[192567]: 2025-10-02 08:37:45.248 2 DEBUG nova.network.neutron [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:37:45 compute-0 nova_compute[192567]: 2025-10-02 08:37:45.640 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:46.002 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:37:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:46.002 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:37:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:46.003 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.591 2 DEBUG nova.network.neutron [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Updating instance_info_cache with network_info: [{"id": "95170647-1bb0-483a-b37b-34ced5ac4541", "address": "fa:16:3e:c1:da:e0", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95170647-1b", "ovs_interfaceid": "95170647-1bb0-483a-b37b-34ced5ac4541", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.611 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.614 2 DEBUG nova.virt.libvirt.driver [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph39wgxi6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.615 2 DEBUG nova.virt.libvirt.driver [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Creating instance directory: /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.616 2 DEBUG nova.virt.libvirt.driver [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Creating disk.info with the contents: {'/var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk': 'qcow2', '/var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.616 2 DEBUG nova.virt.libvirt.driver [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.617 2 DEBUG nova.objects.instance [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.656 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.747 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.749 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.751 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.786 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.871 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.872 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.908 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.910 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.910 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.964 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.966 2 DEBUG nova.virt.disk.api [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:37:47 compute-0 nova_compute[192567]: 2025-10-02 08:37:47.967 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.053 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.055 2 DEBUG nova.virt.disk.api [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.056 2 DEBUG nova.objects.instance [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.070 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.109 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk.config 485376" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.112 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk.config to /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.113 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk.config /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.627 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.627 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.653 2 DEBUG oslo_concurrency.processutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk.config /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.654 2 DEBUG nova.virt.libvirt.driver [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.656 2 DEBUG nova.virt.libvirt.vif [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:35:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-293803348',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-293803348',id=25,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:36:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7e1769197ed0402d83b04ce749e85a94',ramdisk_id='',reservation_id='r-wv8k5hv2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-314916947',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-314916947-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:36:06Z,user_data=None,user_id='e5ba920e8a0e4b888ef2e7bde621cf10',uuid=1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "95170647-1bb0-483a-b37b-34ced5ac4541", "address": "fa:16:3e:c1:da:e0", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap95170647-1b", "ovs_interfaceid": "95170647-1bb0-483a-b37b-34ced5ac4541", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.657 2 DEBUG nova.network.os_vif_util [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "95170647-1bb0-483a-b37b-34ced5ac4541", "address": "fa:16:3e:c1:da:e0", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap95170647-1b", "ovs_interfaceid": "95170647-1bb0-483a-b37b-34ced5ac4541", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.659 2 DEBUG nova.network.os_vif_util [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:da:e0,bridge_name='br-int',has_traffic_filtering=True,id=95170647-1bb0-483a-b37b-34ced5ac4541,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95170647-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.661 2 DEBUG os_vif [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:da:e0,bridge_name='br-int',has_traffic_filtering=True,id=95170647-1bb0-483a-b37b-34ced5ac4541,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95170647-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.665 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.666 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.678 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap95170647-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.679 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap95170647-1b, col_values=(('external_ids', {'iface-id': '95170647-1bb0-483a-b37b-34ced5ac4541', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:da:e0', 'vm-uuid': '1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:48 compute-0 NetworkManager[51654]: <info>  [1759394268.6827] manager: (tap95170647-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.691 2 INFO os_vif [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:da:e0,bridge_name='br-int',has_traffic_filtering=True,id=95170647-1bb0-483a-b37b-34ced5ac4541,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95170647-1b')
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.692 2 DEBUG nova.virt.libvirt.driver [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.693 2 DEBUG nova.compute.manager [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph39wgxi6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:37:48 compute-0 nova_compute[192567]: 2025-10-02 08:37:48.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:49 compute-0 nova_compute[192567]: 2025-10-02 08:37:49.607 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:37:49 compute-0 nova_compute[192567]: 2025-10-02 08:37:49.608 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:37:49 compute-0 nova_compute[192567]: 2025-10-02 08:37:49.609 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:37:49 compute-0 nova_compute[192567]: 2025-10-02 08:37:49.609 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b0fe2b2e-e7da-43d6-9db8-920adf3145fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:37:50 compute-0 nova_compute[192567]: 2025-10-02 08:37:50.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:50.568 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:37:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:50.570 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:37:51 compute-0 nova_compute[192567]: 2025-10-02 08:37:51.679 2 DEBUG nova.network.neutron [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Port 95170647-1bb0-483a-b37b-34ced5ac4541 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:37:51 compute-0 nova_compute[192567]: 2025-10-02 08:37:51.681 2 DEBUG nova.compute.manager [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph39wgxi6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:37:51 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 08:37:51 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 08:37:52 compute-0 kernel: tap95170647-1b: entered promiscuous mode
Oct 02 08:37:52 compute-0 ovn_controller[94821]: 2025-10-02T08:37:52Z|00219|binding|INFO|Claiming lport 95170647-1bb0-483a-b37b-34ced5ac4541 for this additional chassis.
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:52 compute-0 ovn_controller[94821]: 2025-10-02T08:37:52Z|00220|binding|INFO|95170647-1bb0-483a-b37b-34ced5ac4541: Claiming fa:16:3e:c1:da:e0 10.100.0.8
Oct 02 08:37:52 compute-0 NetworkManager[51654]: <info>  [1759394272.0510] manager: (tap95170647-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Oct 02 08:37:52 compute-0 ovn_controller[94821]: 2025-10-02T08:37:52Z|00221|binding|INFO|Setting lport 95170647-1bb0-483a-b37b-34ced5ac4541 ovn-installed in OVS
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:52 compute-0 systemd-udevd[226183]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:37:52 compute-0 systemd-machined[152597]: New machine qemu-21-instance-00000019.
Oct 02 08:37:52 compute-0 NetworkManager[51654]: <info>  [1759394272.1117] device (tap95170647-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:37:52 compute-0 NetworkManager[51654]: <info>  [1759394272.1127] device (tap95170647-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:37:52 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-00000019.
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.600 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Updating instance_info_cache with network_info: [{"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.625 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-b0fe2b2e-e7da-43d6-9db8-920adf3145fa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.627 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.627 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.656 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.656 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.657 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.657 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.762 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.853 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.855 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.922 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:52 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.933 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:52.999 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.000 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.063 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.238 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394273.2383358, 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.239 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] VM Started (Lifecycle Event)
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.265 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.305 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.306 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5672MB free_disk=73.43562316894531GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.307 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.307 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.357 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Migration for instance 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.377 2 INFO nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Updating resource usage from migration a597282f-7596-4652-9b52-36ed569356e8
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.378 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Starting to track incoming migration a597282f-7596-4652-9b52-36ed569356e8 with flavor 932d352e-81e8-4137-94d3-19616d5c2ae2 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.426 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance b0fe2b2e-e7da-43d6-9db8-920adf3145fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.446 2 WARNING nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.447 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.447 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.626 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.644 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.669 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.670 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.363s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.979 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394273.9790008, 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:37:53 compute-0 nova_compute[192567]: 2025-10-02 08:37:53.982 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] VM Resumed (Lifecycle Event)
Oct 02 08:37:54 compute-0 nova_compute[192567]: 2025-10-02 08:37:54.014 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:37:54 compute-0 nova_compute[192567]: 2025-10-02 08:37:54.018 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:37:54 compute-0 nova_compute[192567]: 2025-10-02 08:37:54.053 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:37:55 compute-0 ovn_controller[94821]: 2025-10-02T08:37:55Z|00222|binding|INFO|Claiming lport 95170647-1bb0-483a-b37b-34ced5ac4541 for this chassis.
Oct 02 08:37:55 compute-0 ovn_controller[94821]: 2025-10-02T08:37:55Z|00223|binding|INFO|95170647-1bb0-483a-b37b-34ced5ac4541: Claiming fa:16:3e:c1:da:e0 10.100.0.8
Oct 02 08:37:55 compute-0 ovn_controller[94821]: 2025-10-02T08:37:55Z|00224|binding|INFO|Setting lport 95170647-1bb0-483a-b37b-34ced5ac4541 up in Southbound
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.238 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:da:e0 10.100.0.8'], port_security=['fa:16:3e:c1:da:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e1769197ed0402d83b04ce749e85a94', 'neutron:revision_number': '11', 'neutron:security_group_ids': '57446da9-02a4-4c71-8f97-35915eb59ad9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d33da5a5-f1cf-4e1d-b2d2-a87e57551306, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=95170647-1bb0-483a-b37b-34ced5ac4541) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.240 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 95170647-1bb0-483a-b37b-34ced5ac4541 in datapath 04c40a40-1a0a-4c9e-b85f-8553f3cc214c bound to our chassis
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.243 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04c40a40-1a0a-4c9e-b85f-8553f3cc214c
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.266 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd2c773-095c-4130-b27a-8f78fead14da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.302 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9eba6c-94e5-4c18-8d93-1089eb33282c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.307 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[83a8abcc-63fe-4c00-8778-b0a6c940ebe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.349 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[36f5d3d6-dd3f-47c7-bbc7-7f7811e01e2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.370 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc36390-3119-4302-9d2e-fcdc753c60a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04c40a40-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:61:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1126, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1126, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506845, 'reachable_time': 39398, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226231, 'error': None, 'target': 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.393 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5beb414d-0577-4d26-b91d-dbabc42e2475]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap04c40a40-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506860, 'tstamp': 506860}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226232, 'error': None, 'target': 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap04c40a40-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506865, 'tstamp': 506865}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226232, 'error': None, 'target': 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.396 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04c40a40-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:37:55 compute-0 nova_compute[192567]: 2025-10-02 08:37:55.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:55 compute-0 nova_compute[192567]: 2025-10-02 08:37:55.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.402 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04c40a40-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.402 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.403 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04c40a40-10, col_values=(('external_ids', {'iface-id': 'c4a7e4d0-d6dc-42e2-8605-1bf264bb43d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:37:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:55.404 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:37:55 compute-0 nova_compute[192567]: 2025-10-02 08:37:55.483 2 INFO nova.compute.manager [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Post operation of migration started
Oct 02 08:37:55 compute-0 nova_compute[192567]: 2025-10-02 08:37:55.667 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:55 compute-0 nova_compute[192567]: 2025-10-02 08:37:55.694 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:37:55 compute-0 nova_compute[192567]: 2025-10-02 08:37:55.695 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:37:55 compute-0 nova_compute[192567]: 2025-10-02 08:37:55.807 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:37:55 compute-0 nova_compute[192567]: 2025-10-02 08:37:55.808 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:37:55 compute-0 nova_compute[192567]: 2025-10-02 08:37:55.808 2 DEBUG nova.network.neutron [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:37:57 compute-0 nova_compute[192567]: 2025-10-02 08:37:57.187 2 DEBUG nova.network.neutron [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Updating instance_info_cache with network_info: [{"id": "95170647-1bb0-483a-b37b-34ced5ac4541", "address": "fa:16:3e:c1:da:e0", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95170647-1b", "ovs_interfaceid": "95170647-1bb0-483a-b37b-34ced5ac4541", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:37:57 compute-0 nova_compute[192567]: 2025-10-02 08:37:57.205 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:37:57 compute-0 nova_compute[192567]: 2025-10-02 08:37:57.219 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:37:57 compute-0 nova_compute[192567]: 2025-10-02 08:37:57.219 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:37:57 compute-0 nova_compute[192567]: 2025-10-02 08:37:57.220 2 DEBUG oslo_concurrency.lockutils [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:37:57 compute-0 nova_compute[192567]: 2025-10-02 08:37:57.226 2 INFO nova.virt.libvirt.driver [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:37:57 compute-0 virtqemud[192112]: Domain id=21 name='instance-00000019' uuid=1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 is tainted: custom-monitor
Oct 02 08:37:58 compute-0 nova_compute[192567]: 2025-10-02 08:37:58.234 2 INFO nova.virt.libvirt.driver [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:37:58 compute-0 nova_compute[192567]: 2025-10-02 08:37:58.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:58 compute-0 nova_compute[192567]: 2025-10-02 08:37:58.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:37:59 compute-0 nova_compute[192567]: 2025-10-02 08:37:59.242 2 INFO nova.virt.libvirt.driver [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:37:59 compute-0 nova_compute[192567]: 2025-10-02 08:37:59.248 2 DEBUG nova.compute.manager [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:37:59 compute-0 nova_compute[192567]: 2025-10-02 08:37:59.273 2 DEBUG nova.objects.instance [None req-ed803628-6c25-483d-b3f2-017f7bd5481d f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:37:59 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:37:59.575 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:37:59 compute-0 podman[203011]: time="2025-10-02T08:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:37:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:37:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Oct 02 08:38:00 compute-0 podman[226233]: 2025-10-02 08:38:00.182996007 +0000 UTC m=+0.086492559 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 02 08:38:00 compute-0 podman[226235]: 2025-10-02 08:38:00.184536635 +0000 UTC m=+0.079489198 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:38:00 compute-0 podman[226236]: 2025-10-02 08:38:00.19360178 +0000 UTC m=+0.084671531 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true)
Oct 02 08:38:00 compute-0 podman[226234]: 2025-10-02 08:38:00.224280624 +0000 UTC m=+0.125617727 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Oct 02 08:38:00 compute-0 nova_compute[192567]: 2025-10-02 08:38:00.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:38:01 compute-0 openstack_network_exporter[205118]: ERROR   08:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:38:01 compute-0 openstack_network_exporter[205118]: ERROR   08:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:38:01 compute-0 openstack_network_exporter[205118]: ERROR   08:38:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:38:01 compute-0 openstack_network_exporter[205118]: ERROR   08:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:38:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:38:01 compute-0 openstack_network_exporter[205118]: ERROR   08:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:38:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.485 2 DEBUG oslo_concurrency.lockutils [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.486 2 DEBUG oslo_concurrency.lockutils [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.487 2 DEBUG oslo_concurrency.lockutils [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.487 2 DEBUG oslo_concurrency.lockutils [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.488 2 DEBUG oslo_concurrency.lockutils [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.489 2 INFO nova.compute.manager [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Terminating instance
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.491 2 DEBUG nova.compute.manager [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:38:02 compute-0 kernel: tap44e773e5-eb (unregistering): left promiscuous mode
Oct 02 08:38:02 compute-0 NetworkManager[51654]: <info>  [1759394282.5244] device (tap44e773e5-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:38:02 compute-0 ovn_controller[94821]: 2025-10-02T08:38:02Z|00225|binding|INFO|Releasing lport 44e773e5-eb2f-4318-be7b-41cc770a4f02 from this chassis (sb_readonly=0)
Oct 02 08:38:02 compute-0 ovn_controller[94821]: 2025-10-02T08:38:02Z|00226|binding|INFO|Setting lport 44e773e5-eb2f-4318-be7b-41cc770a4f02 down in Southbound
Oct 02 08:38:02 compute-0 ovn_controller[94821]: 2025-10-02T08:38:02Z|00227|binding|INFO|Removing iface tap44e773e5-eb ovn-installed in OVS
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.587 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:19:c8 10.100.0.5'], port_security=['fa:16:3e:0f:19:c8 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b0fe2b2e-e7da-43d6-9db8-920adf3145fa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e1769197ed0402d83b04ce749e85a94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '57446da9-02a4-4c71-8f97-35915eb59ad9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d33da5a5-f1cf-4e1d-b2d2-a87e57551306, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=44e773e5-eb2f-4318-be7b-41cc770a4f02) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.588 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 44e773e5-eb2f-4318-be7b-41cc770a4f02 in datapath 04c40a40-1a0a-4c9e-b85f-8553f3cc214c unbound from our chassis
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.590 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04c40a40-1a0a-4c9e-b85f-8553f3cc214c
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.615 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[883fca5e-3fc7-48b8-ae90-59b165111644]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:02 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Oct 02 08:38:02 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001a.scope: Consumed 15.988s CPU time.
Oct 02 08:38:02 compute-0 systemd-machined[152597]: Machine qemu-20-instance-0000001a terminated.
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.659 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[882bd822-1af1-4512-9526-5afb7594457a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.664 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[e625ef24-7db6-44a8-9d5d-dd81cd2f1730]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.706 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[fe7cf57d-ac0c-417e-a961-3960c6e8ba7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.736 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[518f6345-89a6-4bd9-8552-d4d9e8a472d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04c40a40-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9a:61:79'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506845, 'reachable_time': 39398, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226331, 'error': None, 'target': 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.765 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8c1d30b9-2c10-4918-9a0f-4b4e6f76da6c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap04c40a40-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506860, 'tstamp': 506860}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226345, 'error': None, 'target': 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap04c40a40-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506865, 'tstamp': 506865}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226345, 'error': None, 'target': 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.766 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04c40a40-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.773 2 INFO nova.virt.libvirt.driver [-] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Instance destroyed successfully.
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.774 2 DEBUG nova.objects.instance [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lazy-loading 'resources' on Instance uuid b0fe2b2e-e7da-43d6-9db8-920adf3145fa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.776 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04c40a40-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.777 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.777 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04c40a40-10, col_values=(('external_ids', {'iface-id': 'c4a7e4d0-d6dc-42e2-8605-1bf264bb43d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:38:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:02.777 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.793 2 DEBUG nova.virt.libvirt.vif [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:36:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-296676066',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-296676066',id=26,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:36:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e1769197ed0402d83b04ce749e85a94',ramdisk_id='',reservation_id='r-4sg1kfzm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_
min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-314916947',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-314916947-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:36:21Z,user_data=None,user_id='e5ba920e8a0e4b888ef2e7bde621cf10',uuid=b0fe2b2e-e7da-43d6-9db8-920adf3145fa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.794 2 DEBUG nova.network.os_vif_util [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Converting VIF {"id": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "address": "fa:16:3e:0f:19:c8", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44e773e5-eb", "ovs_interfaceid": "44e773e5-eb2f-4318-be7b-41cc770a4f02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.795 2 DEBUG nova.network.os_vif_util [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0f:19:c8,bridge_name='br-int',has_traffic_filtering=True,id=44e773e5-eb2f-4318-be7b-41cc770a4f02,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44e773e5-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.796 2 DEBUG os_vif [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:19:c8,bridge_name='br-int',has_traffic_filtering=True,id=44e773e5-eb2f-4318-be7b-41cc770a4f02,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44e773e5-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.800 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44e773e5-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.808 2 INFO os_vif [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0f:19:c8,bridge_name='br-int',has_traffic_filtering=True,id=44e773e5-eb2f-4318-be7b-41cc770a4f02,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44e773e5-eb')
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.809 2 INFO nova.virt.libvirt.driver [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Deleting instance files /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa_del
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.810 2 INFO nova.virt.libvirt.driver [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Deletion of /var/lib/nova/instances/b0fe2b2e-e7da-43d6-9db8-920adf3145fa_del complete
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.876 2 INFO nova.compute.manager [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Took 0.38 seconds to destroy the instance on the hypervisor.
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.877 2 DEBUG oslo.service.loopingcall [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.877 2 DEBUG nova.compute.manager [-] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:38:02 compute-0 nova_compute[192567]: 2025-10-02 08:38:02.877 2 DEBUG nova.network.neutron [-] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:38:03 compute-0 nova_compute[192567]: 2025-10-02 08:38:03.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:04 compute-0 nova_compute[192567]: 2025-10-02 08:38:04.698 2 DEBUG nova.compute.manager [req-1d7ca8ac-70ad-4a4f-a180-b78c656712f0 req-4d50c31b-939d-4a79-886d-709da2b63027 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Received event network-vif-unplugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:38:04 compute-0 nova_compute[192567]: 2025-10-02 08:38:04.699 2 DEBUG oslo_concurrency.lockutils [req-1d7ca8ac-70ad-4a4f-a180-b78c656712f0 req-4d50c31b-939d-4a79-886d-709da2b63027 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:04 compute-0 nova_compute[192567]: 2025-10-02 08:38:04.700 2 DEBUG oslo_concurrency.lockutils [req-1d7ca8ac-70ad-4a4f-a180-b78c656712f0 req-4d50c31b-939d-4a79-886d-709da2b63027 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:04 compute-0 nova_compute[192567]: 2025-10-02 08:38:04.700 2 DEBUG oslo_concurrency.lockutils [req-1d7ca8ac-70ad-4a4f-a180-b78c656712f0 req-4d50c31b-939d-4a79-886d-709da2b63027 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:04 compute-0 nova_compute[192567]: 2025-10-02 08:38:04.700 2 DEBUG nova.compute.manager [req-1d7ca8ac-70ad-4a4f-a180-b78c656712f0 req-4d50c31b-939d-4a79-886d-709da2b63027 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] No waiting events found dispatching network-vif-unplugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:38:04 compute-0 nova_compute[192567]: 2025-10-02 08:38:04.701 2 DEBUG nova.compute.manager [req-1d7ca8ac-70ad-4a4f-a180-b78c656712f0 req-4d50c31b-939d-4a79-886d-709da2b63027 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Received event network-vif-unplugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:38:05 compute-0 podman[226350]: 2025-10-02 08:38:05.185776507 +0000 UTC m=+0.093129836 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:38:05 compute-0 nova_compute[192567]: 2025-10-02 08:38:05.768 2 DEBUG nova.network.neutron [-] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:38:05 compute-0 nova_compute[192567]: 2025-10-02 08:38:05.789 2 INFO nova.compute.manager [-] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Took 2.91 seconds to deallocate network for instance.
Oct 02 08:38:05 compute-0 nova_compute[192567]: 2025-10-02 08:38:05.845 2 DEBUG oslo_concurrency.lockutils [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:05 compute-0 nova_compute[192567]: 2025-10-02 08:38:05.846 2 DEBUG oslo_concurrency.lockutils [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:05 compute-0 nova_compute[192567]: 2025-10-02 08:38:05.864 2 DEBUG nova.compute.manager [req-be81438c-d0ad-4317-af09-16fc33314bbc req-cff6cc63-adf9-4bf7-b62f-10243aa8413b 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Received event network-vif-deleted-44e773e5-eb2f-4318-be7b-41cc770a4f02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:38:05 compute-0 nova_compute[192567]: 2025-10-02 08:38:05.938 2 DEBUG nova.compute.provider_tree [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:38:05 compute-0 nova_compute[192567]: 2025-10-02 08:38:05.953 2 DEBUG nova.scheduler.client.report [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:38:05 compute-0 nova_compute[192567]: 2025-10-02 08:38:05.976 2 DEBUG oslo_concurrency.lockutils [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:06 compute-0 nova_compute[192567]: 2025-10-02 08:38:06.019 2 INFO nova.scheduler.client.report [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Deleted allocations for instance b0fe2b2e-e7da-43d6-9db8-920adf3145fa
Oct 02 08:38:06 compute-0 nova_compute[192567]: 2025-10-02 08:38:06.096 2 DEBUG oslo_concurrency.lockutils [None req-4a76ca79-934f-4630-bcd3-cdd0a4fce2b0 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:06 compute-0 nova_compute[192567]: 2025-10-02 08:38:06.798 2 DEBUG nova.compute.manager [req-020e247e-2537-4d75-9b72-38ec8d187405 req-12c791b6-a442-40c4-adbb-17f9cfa23bc6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Received event network-vif-plugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:38:06 compute-0 nova_compute[192567]: 2025-10-02 08:38:06.799 2 DEBUG oslo_concurrency.lockutils [req-020e247e-2537-4d75-9b72-38ec8d187405 req-12c791b6-a442-40c4-adbb-17f9cfa23bc6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:06 compute-0 nova_compute[192567]: 2025-10-02 08:38:06.799 2 DEBUG oslo_concurrency.lockutils [req-020e247e-2537-4d75-9b72-38ec8d187405 req-12c791b6-a442-40c4-adbb-17f9cfa23bc6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:06 compute-0 nova_compute[192567]: 2025-10-02 08:38:06.800 2 DEBUG oslo_concurrency.lockutils [req-020e247e-2537-4d75-9b72-38ec8d187405 req-12c791b6-a442-40c4-adbb-17f9cfa23bc6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "b0fe2b2e-e7da-43d6-9db8-920adf3145fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:06 compute-0 nova_compute[192567]: 2025-10-02 08:38:06.800 2 DEBUG nova.compute.manager [req-020e247e-2537-4d75-9b72-38ec8d187405 req-12c791b6-a442-40c4-adbb-17f9cfa23bc6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] No waiting events found dispatching network-vif-plugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:38:06 compute-0 nova_compute[192567]: 2025-10-02 08:38:06.801 2 WARNING nova.compute.manager [req-020e247e-2537-4d75-9b72-38ec8d187405 req-12c791b6-a442-40c4-adbb-17f9cfa23bc6 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Received unexpected event network-vif-plugged-44e773e5-eb2f-4318-be7b-41cc770a4f02 for instance with vm_state deleted and task_state None.
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.453 2 DEBUG oslo_concurrency.lockutils [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.454 2 DEBUG oslo_concurrency.lockutils [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.454 2 DEBUG oslo_concurrency.lockutils [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.454 2 DEBUG oslo_concurrency.lockutils [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.455 2 DEBUG oslo_concurrency.lockutils [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.456 2 INFO nova.compute.manager [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Terminating instance
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.457 2 DEBUG nova.compute.manager [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:38:07 compute-0 kernel: tap95170647-1b (unregistering): left promiscuous mode
Oct 02 08:38:07 compute-0 NetworkManager[51654]: <info>  [1759394287.4829] device (tap95170647-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:07 compute-0 ovn_controller[94821]: 2025-10-02T08:38:07Z|00228|binding|INFO|Releasing lport 95170647-1bb0-483a-b37b-34ced5ac4541 from this chassis (sb_readonly=0)
Oct 02 08:38:07 compute-0 ovn_controller[94821]: 2025-10-02T08:38:07Z|00229|binding|INFO|Setting lport 95170647-1bb0-483a-b37b-34ced5ac4541 down in Southbound
Oct 02 08:38:07 compute-0 ovn_controller[94821]: 2025-10-02T08:38:07Z|00230|binding|INFO|Removing iface tap95170647-1b ovn-installed in OVS
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.506 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:da:e0 10.100.0.8'], port_security=['fa:16:3e:c1:da:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e1769197ed0402d83b04ce749e85a94', 'neutron:revision_number': '13', 'neutron:security_group_ids': '57446da9-02a4-4c71-8f97-35915eb59ad9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d33da5a5-f1cf-4e1d-b2d2-a87e57551306, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=95170647-1bb0-483a-b37b-34ced5ac4541) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.508 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 95170647-1bb0-483a-b37b-34ced5ac4541 in datapath 04c40a40-1a0a-4c9e-b85f-8553f3cc214c unbound from our chassis
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.510 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 04c40a40-1a0a-4c9e-b85f-8553f3cc214c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.511 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[7a3ddd50-38e0-432d-b055-d626a23c3fe8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.512 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c namespace which is not needed anymore
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:07 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000019.scope: Deactivated successfully.
Oct 02 08:38:07 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000019.scope: Consumed 2.313s CPU time.
Oct 02 08:38:07 compute-0 systemd-machined[152597]: Machine qemu-21-instance-00000019 terminated.
Oct 02 08:38:07 compute-0 neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c[225694]: [NOTICE]   (225698) : haproxy version is 2.8.14-c23fe91
Oct 02 08:38:07 compute-0 neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c[225694]: [NOTICE]   (225698) : path to executable is /usr/sbin/haproxy
Oct 02 08:38:07 compute-0 neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c[225694]: [WARNING]  (225698) : Exiting Master process...
Oct 02 08:38:07 compute-0 neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c[225694]: [WARNING]  (225698) : Exiting Master process...
Oct 02 08:38:07 compute-0 neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c[225694]: [ALERT]    (225698) : Current worker (225700) exited with code 143 (Terminated)
Oct 02 08:38:07 compute-0 neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c[225694]: [WARNING]  (225698) : All workers exited. Exiting... (0)
Oct 02 08:38:07 compute-0 systemd[1]: libpod-b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588.scope: Deactivated successfully.
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.739 2 INFO nova.virt.libvirt.driver [-] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Instance destroyed successfully.
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.740 2 DEBUG nova.objects.instance [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lazy-loading 'resources' on Instance uuid 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:38:07 compute-0 podman[226399]: 2025-10-02 08:38:07.741104573 +0000 UTC m=+0.078710144 container died b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.754 2 DEBUG nova.virt.libvirt.vif [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:35:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-293803348',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-293803348',id=25,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:36:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e1769197ed0402d83b04ce749e85a94',ramdisk_id='',reservation_id='r-wv8k5hv2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-314916947',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-314916947-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:37:59Z,user_data=None,user_id='e5ba920e8a0e4b888ef2e7bde621cf10',uuid=1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "95170647-1bb0-483a-b37b-34ced5ac4541", "address": "fa:16:3e:c1:da:e0", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95170647-1b", "ovs_interfaceid": "95170647-1bb0-483a-b37b-34ced5ac4541", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.756 2 DEBUG nova.network.os_vif_util [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Converting VIF {"id": "95170647-1bb0-483a-b37b-34ced5ac4541", "address": "fa:16:3e:c1:da:e0", "network": {"id": "04c40a40-1a0a-4c9e-b85f-8553f3cc214c", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1776356709-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8dce8e38e4af4207842a33945f574aa2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95170647-1b", "ovs_interfaceid": "95170647-1bb0-483a-b37b-34ced5ac4541", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.757 2 DEBUG nova.network.os_vif_util [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:da:e0,bridge_name='br-int',has_traffic_filtering=True,id=95170647-1bb0-483a-b37b-34ced5ac4541,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95170647-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.758 2 DEBUG os_vif [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:da:e0,bridge_name='br-int',has_traffic_filtering=True,id=95170647-1bb0-483a-b37b-34ced5ac4541,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95170647-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.762 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95170647-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.813 2 INFO os_vif [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:da:e0,bridge_name='br-int',has_traffic_filtering=True,id=95170647-1bb0-483a-b37b-34ced5ac4541,network=Network(04c40a40-1a0a-4c9e-b85f-8553f3cc214c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95170647-1b')
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.813 2 INFO nova.virt.libvirt.driver [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Deleting instance files /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51_del
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.814 2 INFO nova.virt.libvirt.driver [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Deletion of /var/lib/nova/instances/1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51_del complete
Oct 02 08:38:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588-userdata-shm.mount: Deactivated successfully.
Oct 02 08:38:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-eb930d9235ec845fc214b2d790053d2d094993269cd27cbda1a6e33edf2e8083-merged.mount: Deactivated successfully.
Oct 02 08:38:07 compute-0 podman[226399]: 2025-10-02 08:38:07.825348319 +0000 UTC m=+0.162953900 container cleanup b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 02 08:38:07 compute-0 systemd[1]: libpod-conmon-b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588.scope: Deactivated successfully.
Oct 02 08:38:07 compute-0 podman[226441]: 2025-10-02 08:38:07.885993905 +0000 UTC m=+0.041071151 container remove b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.892 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[c91c9a69-4482-4313-b89f-a16748516963]: (4, ('Thu Oct  2 08:38:07 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c (b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588)\nb63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588\nThu Oct  2 08:38:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c (b63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588)\nb63fa26f86c98b720b767c50a2835839275fc6b2ddceeeac39728e30fc265588\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.894 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f76eaeaf-140d-432a-8744-886df3c0df28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.895 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04c40a40-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:07 compute-0 kernel: tap04c40a40-10: left promiscuous mode
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.902 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[82e2eeb9-2a3a-4076-8891-1a2792d0ac42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.934 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[89c3619b-c7ee-4757-b9c3-f99a4b3d6218]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.935 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e1bf9c8f-1c4a-48ec-a9c3-5a50e22a76ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.962 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a306f589-ddd7-408a-83fb-f57f794cdd1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506836, 'reachable_time': 43921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226456, 'error': None, 'target': 'ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.967 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-04c40a40-1a0a-4c9e-b85f-8553f3cc214c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:38:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:07.967 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[0220e09c-23e8-423a-926b-ffdc60dc3a0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:38:07 compute-0 systemd[1]: run-netns-ovnmeta\x2d04c40a40\x2d1a0a\x2d4c9e\x2db85f\x2d8553f3cc214c.mount: Deactivated successfully.
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.973 2 DEBUG nova.compute.manager [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Received event network-vif-unplugged-95170647-1bb0-483a-b37b-34ced5ac4541 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.974 2 DEBUG oslo_concurrency.lockutils [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.975 2 DEBUG oslo_concurrency.lockutils [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.976 2 DEBUG oslo_concurrency.lockutils [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.977 2 DEBUG nova.compute.manager [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] No waiting events found dispatching network-vif-unplugged-95170647-1bb0-483a-b37b-34ced5ac4541 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.977 2 DEBUG nova.compute.manager [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Received event network-vif-unplugged-95170647-1bb0-483a-b37b-34ced5ac4541 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.977 2 DEBUG nova.compute.manager [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Received event network-vif-plugged-95170647-1bb0-483a-b37b-34ced5ac4541 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.978 2 DEBUG oslo_concurrency.lockutils [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.978 2 DEBUG oslo_concurrency.lockutils [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.979 2 DEBUG oslo_concurrency.lockutils [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.979 2 DEBUG nova.compute.manager [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] No waiting events found dispatching network-vif-plugged-95170647-1bb0-483a-b37b-34ced5ac4541 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.981 2 WARNING nova.compute.manager [req-15b6a79d-4594-4bf4-8d05-c0bd1846a226 req-b4bf4b32-5375-4381-b28a-e24d3c9188bd 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Received unexpected event network-vif-plugged-95170647-1bb0-483a-b37b-34ced5ac4541 for instance with vm_state active and task_state deleting.
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.997 2 INFO nova.compute.manager [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Took 0.54 seconds to destroy the instance on the hypervisor.
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.998 2 DEBUG oslo.service.loopingcall [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:38:07 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.999 2 DEBUG nova.compute.manager [-] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:38:08 compute-0 nova_compute[192567]: 2025-10-02 08:38:07.999 2 DEBUG nova.network.neutron [-] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:38:08 compute-0 nova_compute[192567]: 2025-10-02 08:38:08.509 2 DEBUG nova.network.neutron [-] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:38:08 compute-0 nova_compute[192567]: 2025-10-02 08:38:08.530 2 INFO nova.compute.manager [-] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Took 0.53 seconds to deallocate network for instance.
Oct 02 08:38:08 compute-0 nova_compute[192567]: 2025-10-02 08:38:08.578 2 DEBUG oslo_concurrency.lockutils [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:08 compute-0 nova_compute[192567]: 2025-10-02 08:38:08.578 2 DEBUG oslo_concurrency.lockutils [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:08 compute-0 nova_compute[192567]: 2025-10-02 08:38:08.583 2 DEBUG oslo_concurrency.lockutils [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:08 compute-0 nova_compute[192567]: 2025-10-02 08:38:08.612 2 INFO nova.scheduler.client.report [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Deleted allocations for instance 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51
Oct 02 08:38:08 compute-0 nova_compute[192567]: 2025-10-02 08:38:08.709 2 DEBUG oslo_concurrency.lockutils [None req-a4f44907-1306-488b-8216-4b3dcecfd8e4 e5ba920e8a0e4b888ef2e7bde621cf10 7e1769197ed0402d83b04ce749e85a94 - - default default] Lock "1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:08 compute-0 nova_compute[192567]: 2025-10-02 08:38:08.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:10 compute-0 nova_compute[192567]: 2025-10-02 08:38:10.111 2 DEBUG nova.compute.manager [req-4ffb84e1-82f5-4c55-b75e-c1b0185d60ea req-72aa1b84-56a4-452e-b0ba-3a555867978e 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Received event network-vif-deleted-95170647-1bb0-483a-b37b-34ced5ac4541 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:38:12 compute-0 nova_compute[192567]: 2025-10-02 08:38:12.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:13 compute-0 nova_compute[192567]: 2025-10-02 08:38:13.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:16 compute-0 podman[226457]: 2025-10-02 08:38:16.192101573 +0000 UTC m=+0.097158213 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, maintainer=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:38:17 compute-0 nova_compute[192567]: 2025-10-02 08:38:17.771 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759394282.7696419, b0fe2b2e-e7da-43d6-9db8-920adf3145fa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:38:17 compute-0 nova_compute[192567]: 2025-10-02 08:38:17.772 2 INFO nova.compute.manager [-] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] VM Stopped (Lifecycle Event)
Oct 02 08:38:17 compute-0 nova_compute[192567]: 2025-10-02 08:38:17.795 2 DEBUG nova.compute.manager [None req-4bfade61-1b07-4f3f-b21a-73e79d8b9554 - - - - - -] [instance: b0fe2b2e-e7da-43d6-9db8-920adf3145fa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:38:17 compute-0 nova_compute[192567]: 2025-10-02 08:38:17.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:19 compute-0 nova_compute[192567]: 2025-10-02 08:38:19.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:22 compute-0 nova_compute[192567]: 2025-10-02 08:38:22.736 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759394287.7360861, 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:38:22 compute-0 nova_compute[192567]: 2025-10-02 08:38:22.737 2 INFO nova.compute.manager [-] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] VM Stopped (Lifecycle Event)
Oct 02 08:38:22 compute-0 nova_compute[192567]: 2025-10-02 08:38:22.775 2 DEBUG nova.compute.manager [None req-fff88ef1-dfe6-48a8-8d5f-1e4e9d12f456 - - - - - -] [instance: 1b9b4517-fa2a-446c-8cd8-7ffc50dc8d51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:38:22 compute-0 nova_compute[192567]: 2025-10-02 08:38:22.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:24 compute-0 nova_compute[192567]: 2025-10-02 08:38:24.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:27 compute-0 nova_compute[192567]: 2025-10-02 08:38:27.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:29 compute-0 nova_compute[192567]: 2025-10-02 08:38:29.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:29 compute-0 podman[203011]: time="2025-10-02T08:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:38:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:38:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 02 08:38:30 compute-0 nova_compute[192567]: 2025-10-02 08:38:30.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:31 compute-0 podman[226479]: 2025-10-02 08:38:31.160253024 +0000 UTC m=+0.076756103 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 02 08:38:31 compute-0 podman[226482]: 2025-10-02 08:38:31.189967757 +0000 UTC m=+0.089206293 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:38:31 compute-0 podman[226481]: 2025-10-02 08:38:31.198417053 +0000 UTC m=+0.106933080 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct 02 08:38:31 compute-0 podman[226480]: 2025-10-02 08:38:31.214982954 +0000 UTC m=+0.119081943 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct 02 08:38:31 compute-0 openstack_network_exporter[205118]: ERROR   08:38:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:38:31 compute-0 openstack_network_exporter[205118]: ERROR   08:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:38:31 compute-0 openstack_network_exporter[205118]: ERROR   08:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:38:31 compute-0 openstack_network_exporter[205118]: ERROR   08:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:38:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:38:31 compute-0 openstack_network_exporter[205118]: ERROR   08:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:38:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:38:32 compute-0 nova_compute[192567]: 2025-10-02 08:38:32.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:34 compute-0 nova_compute[192567]: 2025-10-02 08:38:34.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:36 compute-0 podman[226558]: 2025-10-02 08:38:36.190765897 +0000 UTC m=+0.095706669 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:38:37 compute-0 nova_compute[192567]: 2025-10-02 08:38:37.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:39 compute-0 nova_compute[192567]: 2025-10-02 08:38:39.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:42 compute-0 nova_compute[192567]: 2025-10-02 08:38:42.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:44 compute-0 nova_compute[192567]: 2025-10-02 08:38:44.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:46.002 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:46.003 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:46.003 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:47 compute-0 podman[226580]: 2025-10-02 08:38:47.19545253 +0000 UTC m=+0.092654801 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41)
Oct 02 08:38:47 compute-0 nova_compute[192567]: 2025-10-02 08:38:47.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:38:47 compute-0 nova_compute[192567]: 2025-10-02 08:38:47.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:49 compute-0 nova_compute[192567]: 2025-10-02 08:38:49.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:49 compute-0 nova_compute[192567]: 2025-10-02 08:38:49.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:38:49 compute-0 nova_compute[192567]: 2025-10-02 08:38:49.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:38:49 compute-0 nova_compute[192567]: 2025-10-02 08:38:49.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:38:49 compute-0 nova_compute[192567]: 2025-10-02 08:38:49.654 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:38:50 compute-0 nova_compute[192567]: 2025-10-02 08:38:50.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:38:50 compute-0 nova_compute[192567]: 2025-10-02 08:38:50.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:38:51 compute-0 nova_compute[192567]: 2025-10-02 08:38:51.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:38:51 compute-0 nova_compute[192567]: 2025-10-02 08:38:51.661 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:51 compute-0 nova_compute[192567]: 2025-10-02 08:38:51.661 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:51 compute-0 nova_compute[192567]: 2025-10-02 08:38:51.661 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:51 compute-0 nova_compute[192567]: 2025-10-02 08:38:51.661 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:38:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:51.833 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:38:51 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:38:51.834 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:38:51 compute-0 nova_compute[192567]: 2025-10-02 08:38:51.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:51 compute-0 nova_compute[192567]: 2025-10-02 08:38:51.876 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:38:51 compute-0 nova_compute[192567]: 2025-10-02 08:38:51.877 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5865MB free_disk=73.46514129638672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:38:51 compute-0 nova_compute[192567]: 2025-10-02 08:38:51.878 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:38:51 compute-0 nova_compute[192567]: 2025-10-02 08:38:51.878 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:38:52 compute-0 nova_compute[192567]: 2025-10-02 08:38:52.133 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:38:52 compute-0 nova_compute[192567]: 2025-10-02 08:38:52.134 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:38:52 compute-0 nova_compute[192567]: 2025-10-02 08:38:52.157 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:38:52 compute-0 nova_compute[192567]: 2025-10-02 08:38:52.173 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:38:52 compute-0 nova_compute[192567]: 2025-10-02 08:38:52.204 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:38:52 compute-0 nova_compute[192567]: 2025-10-02 08:38:52.205 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:38:52 compute-0 nova_compute[192567]: 2025-10-02 08:38:52.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:54 compute-0 nova_compute[192567]: 2025-10-02 08:38:54.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:54 compute-0 nova_compute[192567]: 2025-10-02 08:38:54.206 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:38:54 compute-0 nova_compute[192567]: 2025-10-02 08:38:54.207 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:38:56 compute-0 nova_compute[192567]: 2025-10-02 08:38:56.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:38:56 compute-0 nova_compute[192567]: 2025-10-02 08:38:56.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:38:57 compute-0 nova_compute[192567]: 2025-10-02 08:38:57.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:59 compute-0 nova_compute[192567]: 2025-10-02 08:38:59.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:38:59 compute-0 podman[203011]: time="2025-10-02T08:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:38:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:38:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Oct 02 08:39:01 compute-0 openstack_network_exporter[205118]: ERROR   08:39:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:39:01 compute-0 openstack_network_exporter[205118]: ERROR   08:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:39:01 compute-0 openstack_network_exporter[205118]: ERROR   08:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:39:01 compute-0 openstack_network_exporter[205118]: ERROR   08:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:39:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:39:01 compute-0 openstack_network_exporter[205118]: ERROR   08:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:39:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:39:01 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:39:01.837 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:39:02 compute-0 podman[226603]: 2025-10-02 08:39:02.165696943 +0000 UTC m=+0.071590780 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 02 08:39:02 compute-0 podman[226606]: 2025-10-02 08:39:02.185927748 +0000 UTC m=+0.070825436 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3)
Oct 02 08:39:02 compute-0 podman[226605]: 2025-10-02 08:39:02.194624092 +0000 UTC m=+0.086086756 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:39:02 compute-0 podman[226604]: 2025-10-02 08:39:02.216927842 +0000 UTC m=+0.112314159 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 02 08:39:02 compute-0 nova_compute[192567]: 2025-10-02 08:39:02.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:39:02 compute-0 nova_compute[192567]: 2025-10-02 08:39:02.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:04 compute-0 nova_compute[192567]: 2025-10-02 08:39:04.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:06 compute-0 ovn_controller[94821]: 2025-10-02T08:39:06Z|00231|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Oct 02 08:39:07 compute-0 podman[226684]: 2025-10-02 08:39:07.200515871 +0000 UTC m=+0.106183297 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:39:07 compute-0 nova_compute[192567]: 2025-10-02 08:39:07.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:09 compute-0 nova_compute[192567]: 2025-10-02 08:39:09.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:12 compute-0 nova_compute[192567]: 2025-10-02 08:39:12.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:14 compute-0 nova_compute[192567]: 2025-10-02 08:39:14.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:17 compute-0 nova_compute[192567]: 2025-10-02 08:39:17.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:18 compute-0 podman[226708]: 2025-10-02 08:39:18.162557754 +0000 UTC m=+0.071379954 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 02 08:39:19 compute-0 nova_compute[192567]: 2025-10-02 08:39:19.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:22 compute-0 nova_compute[192567]: 2025-10-02 08:39:22.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:24 compute-0 nova_compute[192567]: 2025-10-02 08:39:24.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:27 compute-0 nova_compute[192567]: 2025-10-02 08:39:27.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:29 compute-0 nova_compute[192567]: 2025-10-02 08:39:29.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:29 compute-0 podman[203011]: time="2025-10-02T08:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:39:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:39:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Oct 02 08:39:31 compute-0 openstack_network_exporter[205118]: ERROR   08:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:39:31 compute-0 openstack_network_exporter[205118]: ERROR   08:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:39:31 compute-0 openstack_network_exporter[205118]: ERROR   08:39:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:39:31 compute-0 openstack_network_exporter[205118]: ERROR   08:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:39:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:39:31 compute-0 openstack_network_exporter[205118]: ERROR   08:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:39:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:39:32 compute-0 nova_compute[192567]: 2025-10-02 08:39:32.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:33 compute-0 podman[226733]: 2025-10-02 08:39:33.150913998 +0000 UTC m=+0.052630995 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 02 08:39:33 compute-0 podman[226731]: 2025-10-02 08:39:33.185967419 +0000 UTC m=+0.082419981 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:39:33 compute-0 podman[226732]: 2025-10-02 08:39:33.191669088 +0000 UTC m=+0.096229945 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:39:33 compute-0 podman[226734]: 2025-10-02 08:39:33.198379199 +0000 UTC m=+0.084793275 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 02 08:39:34 compute-0 nova_compute[192567]: 2025-10-02 08:39:34.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:37 compute-0 nova_compute[192567]: 2025-10-02 08:39:37.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:38 compute-0 podman[226812]: 2025-10-02 08:39:38.188605435 +0000 UTC m=+0.092482546 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:39:39 compute-0 nova_compute[192567]: 2025-10-02 08:39:39.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:42 compute-0 nova_compute[192567]: 2025-10-02 08:39:42.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:44 compute-0 nova_compute[192567]: 2025-10-02 08:39:44.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:39:46.003 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:39:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:39:46.003 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:39:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:39:46.003 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:39:47 compute-0 nova_compute[192567]: 2025-10-02 08:39:47.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:48 compute-0 nova_compute[192567]: 2025-10-02 08:39:48.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:39:49 compute-0 podman[226838]: 2025-10-02 08:39:49.173938321 +0000 UTC m=+0.085329253 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base 
Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public)
Oct 02 08:39:49 compute-0 nova_compute[192567]: 2025-10-02 08:39:49.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:49 compute-0 nova_compute[192567]: 2025-10-02 08:39:49.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:39:49 compute-0 nova_compute[192567]: 2025-10-02 08:39:49.623 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:39:49 compute-0 nova_compute[192567]: 2025-10-02 08:39:49.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:39:49 compute-0 nova_compute[192567]: 2025-10-02 08:39:49.652 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.651 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.652 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.652 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.652 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.816 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.817 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5884MB free_disk=73.46198654174805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.817 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.817 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.898 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.899 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.920 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.934 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.936 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:39:51 compute-0 nova_compute[192567]: 2025-10-02 08:39:51.936 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:39:52 compute-0 nova_compute[192567]: 2025-10-02 08:39:52.937 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:39:52 compute-0 nova_compute[192567]: 2025-10-02 08:39:52.938 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:39:53 compute-0 nova_compute[192567]: 2025-10-02 08:39:53.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:53 compute-0 nova_compute[192567]: 2025-10-02 08:39:53.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:39:54 compute-0 nova_compute[192567]: 2025-10-02 08:39:54.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:54 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 08:39:54 compute-0 nova_compute[192567]: 2025-10-02 08:39:54.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:39:55 compute-0 nova_compute[192567]: 2025-10-02 08:39:55.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:39:58 compute-0 nova_compute[192567]: 2025-10-02 08:39:58.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:58 compute-0 nova_compute[192567]: 2025-10-02 08:39:58.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:39:58 compute-0 nova_compute[192567]: 2025-10-02 08:39:58.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:39:59 compute-0 nova_compute[192567]: 2025-10-02 08:39:59.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:39:59 compute-0 podman[203011]: time="2025-10-02T08:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:39:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:39:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 02 08:40:01 compute-0 openstack_network_exporter[205118]: ERROR   08:40:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:40:01 compute-0 openstack_network_exporter[205118]: ERROR   08:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:40:01 compute-0 openstack_network_exporter[205118]: ERROR   08:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:40:01 compute-0 openstack_network_exporter[205118]: ERROR   08:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:40:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:40:01 compute-0 openstack_network_exporter[205118]: ERROR   08:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:40:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:40:02 compute-0 nova_compute[192567]: 2025-10-02 08:40:02.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:40:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:02.863 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:40:02 compute-0 nova_compute[192567]: 2025-10-02 08:40:02.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:02.865 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:40:03 compute-0 nova_compute[192567]: 2025-10-02 08:40:03.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:04 compute-0 podman[226864]: 2025-10-02 08:40:04.159474667 +0000 UTC m=+0.060657346 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:40:04 compute-0 podman[226861]: 2025-10-02 08:40:04.167888232 +0000 UTC m=+0.078085874 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:40:04 compute-0 podman[226863]: 2025-10-02 08:40:04.179122405 +0000 UTC m=+0.078672323 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 02 08:40:04 compute-0 podman[226862]: 2025-10-02 08:40:04.206374851 +0000 UTC m=+0.111262036 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct 02 08:40:04 compute-0 nova_compute[192567]: 2025-10-02 08:40:04.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:04 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:04.868 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:40:08 compute-0 nova_compute[192567]: 2025-10-02 08:40:08.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:09 compute-0 podman[226941]: 2025-10-02 08:40:09.143973603 +0000 UTC m=+0.061159432 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:40:09 compute-0 nova_compute[192567]: 2025-10-02 08:40:09.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:13 compute-0 nova_compute[192567]: 2025-10-02 08:40:13.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:13 compute-0 unix_chkpwd[226970]: password check failed for user (root)
Oct 02 08:40:13 compute-0 sshd-session[226968]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Oct 02 08:40:14 compute-0 nova_compute[192567]: 2025-10-02 08:40:14.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:15 compute-0 sshd-session[226968]: Failed password for root from 141.98.10.225 port 23834 ssh2
Oct 02 08:40:17 compute-0 unix_chkpwd[226971]: password check failed for user (root)
Oct 02 08:40:18 compute-0 nova_compute[192567]: 2025-10-02 08:40:18.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:18 compute-0 sshd-session[226968]: Failed password for root from 141.98.10.225 port 23834 ssh2
Oct 02 08:40:19 compute-0 nova_compute[192567]: 2025-10-02 08:40:19.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:19 compute-0 unix_chkpwd[226972]: password check failed for user (root)
Oct 02 08:40:20 compute-0 podman[226973]: 2025-10-02 08:40:20.192531307 +0000 UTC m=+0.098393473 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Oct 02 08:40:20 compute-0 sshd-session[226968]: Failed password for root from 141.98.10.225 port 23834 ssh2
Oct 02 08:40:21 compute-0 sshd-session[226968]: Received disconnect from 141.98.10.225 port 23834:11:  [preauth]
Oct 02 08:40:21 compute-0 sshd-session[226968]: Disconnected from authenticating user root 141.98.10.225 port 23834 [preauth]
Oct 02 08:40:21 compute-0 sshd-session[226968]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Oct 02 08:40:22 compute-0 unix_chkpwd[226996]: password check failed for user (root)
Oct 02 08:40:22 compute-0 sshd-session[226994]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Oct 02 08:40:23 compute-0 nova_compute[192567]: 2025-10-02 08:40:23.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:24 compute-0 nova_compute[192567]: 2025-10-02 08:40:24.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:24 compute-0 sshd-session[226994]: Failed password for root from 141.98.10.225 port 23952 ssh2
Oct 02 08:40:26 compute-0 unix_chkpwd[226997]: password check failed for user (root)
Oct 02 08:40:28 compute-0 nova_compute[192567]: 2025-10-02 08:40:28.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:28 compute-0 sshd-session[226994]: Failed password for root from 141.98.10.225 port 23952 ssh2
Oct 02 08:40:28 compute-0 unix_chkpwd[226998]: password check failed for user (root)
Oct 02 08:40:29 compute-0 nova_compute[192567]: 2025-10-02 08:40:29.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:29 compute-0 podman[203011]: time="2025-10-02T08:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:40:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:40:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Oct 02 08:40:30 compute-0 sshd-session[226994]: Failed password for root from 141.98.10.225 port 23952 ssh2
Oct 02 08:40:31 compute-0 openstack_network_exporter[205118]: ERROR   08:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:40:31 compute-0 openstack_network_exporter[205118]: ERROR   08:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:40:31 compute-0 openstack_network_exporter[205118]: ERROR   08:40:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:40:31 compute-0 openstack_network_exporter[205118]: ERROR   08:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:40:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:40:31 compute-0 openstack_network_exporter[205118]: ERROR   08:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:40:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:40:32 compute-0 sshd-session[226994]: Received disconnect from 141.98.10.225 port 23952:11:  [preauth]
Oct 02 08:40:32 compute-0 sshd-session[226994]: Disconnected from authenticating user root 141.98.10.225 port 23952 [preauth]
Oct 02 08:40:32 compute-0 sshd-session[226994]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Oct 02 08:40:33 compute-0 nova_compute[192567]: 2025-10-02 08:40:33.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:33 compute-0 unix_chkpwd[227001]: password check failed for user (root)
Oct 02 08:40:33 compute-0 sshd-session[226999]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Oct 02 08:40:34 compute-0 nova_compute[192567]: 2025-10-02 08:40:34.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:35 compute-0 podman[227002]: 2025-10-02 08:40:35.208666502 +0000 UTC m=+0.101950494 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 02 08:40:35 compute-0 podman[227004]: 2025-10-02 08:40:35.209278721 +0000 UTC m=+0.090235686 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct 02 08:40:35 compute-0 podman[227005]: 2025-10-02 08:40:35.23345865 +0000 UTC m=+0.111690349 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 02 08:40:35 compute-0 podman[227003]: 2025-10-02 08:40:35.24937233 +0000 UTC m=+0.130776479 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:40:35 compute-0 sshd-session[226999]: Failed password for root from 141.98.10.225 port 46842 ssh2
Oct 02 08:40:37 compute-0 unix_chkpwd[227082]: password check failed for user (root)
Oct 02 08:40:38 compute-0 nova_compute[192567]: 2025-10-02 08:40:38.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:38 compute-0 nova_compute[192567]: 2025-10-02 08:40:38.569 2 DEBUG nova.virt.libvirt.driver [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Creating tmpfile /var/lib/nova/instances/tmp0w8lt7ph to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:40:38 compute-0 nova_compute[192567]: 2025-10-02 08:40:38.570 2 DEBUG nova.compute.manager [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0w8lt7ph',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:40:39 compute-0 nova_compute[192567]: 2025-10-02 08:40:39.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:39 compute-0 sshd-session[226999]: Failed password for root from 141.98.10.225 port 46842 ssh2
Oct 02 08:40:40 compute-0 podman[227083]: 2025-10-02 08:40:40.135216569 +0000 UTC m=+0.058998204 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:40:40 compute-0 nova_compute[192567]: 2025-10-02 08:40:40.217 2 DEBUG nova.compute.manager [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0w8lt7ph',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='75e90fab-314f-4903-bec1-6446ea4ad7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:40:40 compute-0 nova_compute[192567]: 2025-10-02 08:40:40.250 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-75e90fab-314f-4903-bec1-6446ea4ad7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:40:40 compute-0 nova_compute[192567]: 2025-10-02 08:40:40.251 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-75e90fab-314f-4903-bec1-6446ea4ad7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:40:40 compute-0 nova_compute[192567]: 2025-10-02 08:40:40.251 2 DEBUG nova.network.neutron [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.337 2 DEBUG nova.network.neutron [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Updating instance_info_cache with network_info: [{"id": "49e98fa1-221b-4416-a850-f14fd001fc00", "address": "fa:16:3e:a2:ac:11", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49e98fa1-22", "ovs_interfaceid": "49e98fa1-221b-4416-a850-f14fd001fc00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.356 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-75e90fab-314f-4903-bec1-6446ea4ad7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.357 2 DEBUG nova.virt.libvirt.driver [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0w8lt7ph',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='75e90fab-314f-4903-bec1-6446ea4ad7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.358 2 DEBUG nova.virt.libvirt.driver [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Creating instance directory: /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.358 2 DEBUG nova.virt.libvirt.driver [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Creating disk.info with the contents: {'/var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk': 'qcow2', '/var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.359 2 DEBUG nova.virt.libvirt.driver [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.359 2 DEBUG nova.objects.instance [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 75e90fab-314f-4903-bec1-6446ea4ad7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.387 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.444 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.445 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.445 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.457 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.520 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.521 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.578 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.580 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.581 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.663 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.665 2 DEBUG nova.virt.disk.api [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.666 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.731 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.732 2 DEBUG nova.virt.disk.api [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.733 2 DEBUG nova.objects.instance [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 75e90fab-314f-4903-bec1-6446ea4ad7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.761 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.804 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk.config 485376" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.806 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk.config to /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:40:41 compute-0 nova_compute[192567]: 2025-10-02 08:40:41.807 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk.config /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:40:42 compute-0 unix_chkpwd[227128]: password check failed for user (root)
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.376 2 DEBUG oslo_concurrency.processutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk.config /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed" returned: 0 in 0.569s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.377 2 DEBUG nova.virt.libvirt.driver [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.379 2 DEBUG nova.virt.libvirt.vif [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:39:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-2095835694',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-2095835694',id=30,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:40:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cfed6615d64e404ab1542b20621438d9',ramdisk_id='',reservation_id='r-u3yn0w0c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-2031848124',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-2031848124-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:40:03Z,user_data=None,user_id='ab2d5dc08c96417b93ba3fc03cddf0cd',uuid=75e90fab-314f-4903-bec1-6446ea4ad7ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49e98fa1-221b-4416-a850-f14fd001fc00", "address": "fa:16:3e:a2:ac:11", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap49e98fa1-22", "ovs_interfaceid": "49e98fa1-221b-4416-a850-f14fd001fc00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.380 2 DEBUG nova.network.os_vif_util [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "49e98fa1-221b-4416-a850-f14fd001fc00", "address": "fa:16:3e:a2:ac:11", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap49e98fa1-22", "ovs_interfaceid": "49e98fa1-221b-4416-a850-f14fd001fc00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.382 2 DEBUG nova.network.os_vif_util [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:ac:11,bridge_name='br-int',has_traffic_filtering=True,id=49e98fa1-221b-4416-a850-f14fd001fc00,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49e98fa1-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.382 2 DEBUG os_vif [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:ac:11,bridge_name='br-int',has_traffic_filtering=True,id=49e98fa1-221b-4416-a850-f14fd001fc00,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49e98fa1-22') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.385 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.386 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.393 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap49e98fa1-22, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.394 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap49e98fa1-22, col_values=(('external_ids', {'iface-id': '49e98fa1-221b-4416-a850-f14fd001fc00', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:ac:11', 'vm-uuid': '75e90fab-314f-4903-bec1-6446ea4ad7ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:42 compute-0 NetworkManager[51654]: <info>  [1759394442.4459] manager: (tap49e98fa1-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.456 2 INFO os_vif [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:ac:11,bridge_name='br-int',has_traffic_filtering=True,id=49e98fa1-221b-4416-a850-f14fd001fc00,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49e98fa1-22')
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.457 2 DEBUG nova.virt.libvirt.driver [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:40:42 compute-0 nova_compute[192567]: 2025-10-02 08:40:42.457 2 DEBUG nova.compute.manager [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0w8lt7ph',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='75e90fab-314f-4903-bec1-6446ea4ad7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:40:43 compute-0 nova_compute[192567]: 2025-10-02 08:40:43.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:43 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:43.064 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:40:43 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:43.067 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:40:43 compute-0 nova_compute[192567]: 2025-10-02 08:40:43.314 2 DEBUG nova.network.neutron [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Port 49e98fa1-221b-4416-a850-f14fd001fc00 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:40:43 compute-0 nova_compute[192567]: 2025-10-02 08:40:43.316 2 DEBUG nova.compute.manager [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0w8lt7ph',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='75e90fab-314f-4903-bec1-6446ea4ad7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:40:43 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 08:40:43 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 08:40:43 compute-0 kernel: tap49e98fa1-22: entered promiscuous mode
Oct 02 08:40:43 compute-0 NetworkManager[51654]: <info>  [1759394443.7498] manager: (tap49e98fa1-22): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Oct 02 08:40:43 compute-0 nova_compute[192567]: 2025-10-02 08:40:43.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:43 compute-0 ovn_controller[94821]: 2025-10-02T08:40:43Z|00232|binding|INFO|Claiming lport 49e98fa1-221b-4416-a850-f14fd001fc00 for this additional chassis.
Oct 02 08:40:43 compute-0 ovn_controller[94821]: 2025-10-02T08:40:43Z|00233|binding|INFO|49e98fa1-221b-4416-a850-f14fd001fc00: Claiming fa:16:3e:a2:ac:11 10.100.0.9
Oct 02 08:40:43 compute-0 nova_compute[192567]: 2025-10-02 08:40:43.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:43 compute-0 sshd-session[226999]: Failed password for root from 141.98.10.225 port 46842 ssh2
Oct 02 08:40:43 compute-0 systemd-udevd[227162]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:40:43 compute-0 systemd-machined[152597]: New machine qemu-22-instance-0000001e.
Oct 02 08:40:43 compute-0 NetworkManager[51654]: <info>  [1759394443.8609] device (tap49e98fa1-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:40:43 compute-0 NetworkManager[51654]: <info>  [1759394443.8627] device (tap49e98fa1-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:40:43 compute-0 ovn_controller[94821]: 2025-10-02T08:40:43Z|00234|binding|INFO|Setting lport 49e98fa1-221b-4416-a850-f14fd001fc00 ovn-installed in OVS
Oct 02 08:40:43 compute-0 nova_compute[192567]: 2025-10-02 08:40:43.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:43 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001e.
Oct 02 08:40:44 compute-0 sshd-session[226999]: Received disconnect from 141.98.10.225 port 46842:11:  [preauth]
Oct 02 08:40:44 compute-0 sshd-session[226999]: Disconnected from authenticating user root 141.98.10.225 port 46842 [preauth]
Oct 02 08:40:44 compute-0 sshd-session[226999]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Oct 02 08:40:44 compute-0 nova_compute[192567]: 2025-10-02 08:40:44.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:44 compute-0 nova_compute[192567]: 2025-10-02 08:40:44.784 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394444.7834146, 75e90fab-314f-4903-bec1-6446ea4ad7ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:40:44 compute-0 nova_compute[192567]: 2025-10-02 08:40:44.785 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] VM Started (Lifecycle Event)
Oct 02 08:40:44 compute-0 nova_compute[192567]: 2025-10-02 08:40:44.812 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:40:45 compute-0 nova_compute[192567]: 2025-10-02 08:40:45.691 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394445.6907206, 75e90fab-314f-4903-bec1-6446ea4ad7ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:40:45 compute-0 nova_compute[192567]: 2025-10-02 08:40:45.691 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] VM Resumed (Lifecycle Event)
Oct 02 08:40:45 compute-0 nova_compute[192567]: 2025-10-02 08:40:45.762 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:40:45 compute-0 nova_compute[192567]: 2025-10-02 08:40:45.767 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:40:45 compute-0 nova_compute[192567]: 2025-10-02 08:40:45.790 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.004 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.004 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.004 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.072 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:40:46 compute-0 ovn_controller[94821]: 2025-10-02T08:40:46Z|00235|binding|INFO|Claiming lport 49e98fa1-221b-4416-a850-f14fd001fc00 for this chassis.
Oct 02 08:40:46 compute-0 ovn_controller[94821]: 2025-10-02T08:40:46Z|00236|binding|INFO|49e98fa1-221b-4416-a850-f14fd001fc00: Claiming fa:16:3e:a2:ac:11 10.100.0.9
Oct 02 08:40:46 compute-0 ovn_controller[94821]: 2025-10-02T08:40:46Z|00237|binding|INFO|Setting lport 49e98fa1-221b-4416-a850-f14fd001fc00 up in Southbound
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.897 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:ac:11 10.100.0.9'], port_security=['fa:16:3e:a2:ac:11 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '75e90fab-314f-4903-bec1-6446ea4ad7ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a09407-34b6-42b5-8fee-510c4d23f792', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cfed6615d64e404ab1542b20621438d9', 'neutron:revision_number': '11', 'neutron:security_group_ids': '7a657333-92c9-49e2-9326-8e87ae1eae40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=177cfe6a-1924-46ab-8ec3-256ed9c4e2cc, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=49e98fa1-221b-4416-a850-f14fd001fc00) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.899 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 49e98fa1-221b-4416-a850-f14fd001fc00 in datapath 42a09407-34b6-42b5-8fee-510c4d23f792 bound to our chassis
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.902 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 42a09407-34b6-42b5-8fee-510c4d23f792
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.917 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a8825b72-01f3-4c3d-a3d5-70c03c267a79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.919 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap42a09407-31 in ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.923 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap42a09407-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.923 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5f42ed-2b22-4a9e-8331-63b3e9f57b0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.925 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa5a2ee-fc14-432a-88dc-1dbcb0aedf4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.940 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[87399661-4157-44b5-ab2b-2951630affb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:46.958 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[810b49e0-80ee-41a8-a3eb-0167e85faebf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.003 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[48bdf298-5fc5-49ab-9e94-2966d3b36258]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.011 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[610ef220-dbda-4c43-8915-b4cb79db6150]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 NetworkManager[51654]: <info>  [1759394447.0132] manager: (tap42a09407-30): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.061 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[57f51b15-9a4c-4889-a760-a2b019cce869]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.066 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f4b91d-d4d4-4b2a-a9e2-227fa7913b8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 systemd-udevd[227200]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:40:47 compute-0 NetworkManager[51654]: <info>  [1759394447.1058] device (tap42a09407-30): carrier: link connected
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.116 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[cf931791-2b5e-4004-90a9-4c29f9b09a0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.141 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4faac591-db3c-4ff7-88c6-b541ac695b12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42a09407-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:f2:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533469, 'reachable_time': 36742, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227219, 'error': None, 'target': 'ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.164 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f6f2d906-b932-4b35-a4ea-10d406e5e207]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:f213'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533469, 'tstamp': 533469}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227220, 'error': None, 'target': 'ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.187 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e2fad236-b2e6-4182-95d6-43c6026e838e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42a09407-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:f2:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533469, 'reachable_time': 36742, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227221, 'error': None, 'target': 'ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.230 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2c0e3b-a606-4fa7-81ac-2a53efeb64cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.320 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[1fb0b8fa-edae-409a-8153-8abdbd445135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.322 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42a09407-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.322 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.323 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42a09407-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:40:47 compute-0 nova_compute[192567]: 2025-10-02 08:40:47.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:47 compute-0 kernel: tap42a09407-30: entered promiscuous mode
Oct 02 08:40:47 compute-0 NetworkManager[51654]: <info>  [1759394447.3287] manager: (tap42a09407-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Oct 02 08:40:47 compute-0 nova_compute[192567]: 2025-10-02 08:40:47.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.331 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap42a09407-30, col_values=(('external_ids', {'iface-id': '07a47f89-e193-4dc6-986d-f5fa01a04e07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:40:47 compute-0 nova_compute[192567]: 2025-10-02 08:40:47.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:47 compute-0 ovn_controller[94821]: 2025-10-02T08:40:47Z|00238|binding|INFO|Releasing lport 07a47f89-e193-4dc6-986d-f5fa01a04e07 from this chassis (sb_readonly=0)
Oct 02 08:40:47 compute-0 nova_compute[192567]: 2025-10-02 08:40:47.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.358 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/42a09407-34b6-42b5-8fee-510c4d23f792.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/42a09407-34b6-42b5-8fee-510c4d23f792.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.359 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[73e68b3b-3ce6-47ff-8c44-e9ec152ef53c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.360 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-42a09407-34b6-42b5-8fee-510c4d23f792
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/42a09407-34b6-42b5-8fee-510c4d23f792.pid.haproxy
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 42a09407-34b6-42b5-8fee-510c4d23f792
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:40:47 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 08:40:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:47.361 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792', 'env', 'PROCESS_TAG=haproxy-42a09407-34b6-42b5-8fee-510c4d23f792', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/42a09407-34b6-42b5-8fee-510c4d23f792.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:40:47 compute-0 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 02 08:40:47 compute-0 nova_compute[192567]: 2025-10-02 08:40:47.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:47 compute-0 nova_compute[192567]: 2025-10-02 08:40:47.768 2 INFO nova.compute.manager [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Post operation of migration started
Oct 02 08:40:47 compute-0 podman[227255]: 2025-10-02 08:40:47.784550769 +0000 UTC m=+0.058311123 container create 3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 02 08:40:47 compute-0 systemd[1]: Started libpod-conmon-3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7.scope.
Oct 02 08:40:47 compute-0 podman[227255]: 2025-10-02 08:40:47.755455745 +0000 UTC m=+0.029216089 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:40:47 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:40:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8afb6870ef1682b56b8a817e0e4f18cf147222160911cbe379961066515c917/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:40:47 compute-0 podman[227255]: 2025-10-02 08:40:47.900590005 +0000 UTC m=+0.174350379 container init 3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:40:47 compute-0 podman[227255]: 2025-10-02 08:40:47.912909633 +0000 UTC m=+0.186669987 container start 3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct 02 08:40:47 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[227270]: [NOTICE]   (227274) : New worker (227276) forked
Oct 02 08:40:47 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[227270]: [NOTICE]   (227274) : Loading success.
Oct 02 08:40:48 compute-0 nova_compute[192567]: 2025-10-02 08:40:48.045 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-75e90fab-314f-4903-bec1-6446ea4ad7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:40:48 compute-0 nova_compute[192567]: 2025-10-02 08:40:48.046 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-75e90fab-314f-4903-bec1-6446ea4ad7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:40:48 compute-0 nova_compute[192567]: 2025-10-02 08:40:48.046 2 DEBUG nova.network.neutron [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:40:48 compute-0 nova_compute[192567]: 2025-10-02 08:40:48.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:40:49 compute-0 nova_compute[192567]: 2025-10-02 08:40:49.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:50 compute-0 nova_compute[192567]: 2025-10-02 08:40:50.493 2 DEBUG nova.network.neutron [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Updating instance_info_cache with network_info: [{"id": "49e98fa1-221b-4416-a850-f14fd001fc00", "address": "fa:16:3e:a2:ac:11", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49e98fa1-22", "ovs_interfaceid": "49e98fa1-221b-4416-a850-f14fd001fc00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:40:50 compute-0 nova_compute[192567]: 2025-10-02 08:40:50.524 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-75e90fab-314f-4903-bec1-6446ea4ad7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:40:50 compute-0 nova_compute[192567]: 2025-10-02 08:40:50.545 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:40:50 compute-0 nova_compute[192567]: 2025-10-02 08:40:50.546 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:40:50 compute-0 nova_compute[192567]: 2025-10-02 08:40:50.547 2 DEBUG oslo_concurrency.lockutils [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:40:50 compute-0 nova_compute[192567]: 2025-10-02 08:40:50.557 2 INFO nova.virt.libvirt.driver [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:40:50 compute-0 virtqemud[192112]: Domain id=22 name='instance-0000001e' uuid=75e90fab-314f-4903-bec1-6446ea4ad7ed is tainted: custom-monitor
Oct 02 08:40:50 compute-0 nova_compute[192567]: 2025-10-02 08:40:50.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:40:50 compute-0 nova_compute[192567]: 2025-10-02 08:40:50.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:40:50 compute-0 nova_compute[192567]: 2025-10-02 08:40:50.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:40:50 compute-0 nova_compute[192567]: 2025-10-02 08:40:50.649 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:40:51 compute-0 podman[227285]: 2025-10-02 08:40:51.173087072 +0000 UTC m=+0.081837481 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 02 08:40:51 compute-0 nova_compute[192567]: 2025-10-02 08:40:51.570 2 INFO nova.virt.libvirt.driver [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:40:51 compute-0 nova_compute[192567]: 2025-10-02 08:40:51.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:40:51 compute-0 nova_compute[192567]: 2025-10-02 08:40:51.647 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:40:51 compute-0 nova_compute[192567]: 2025-10-02 08:40:51.648 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:40:51 compute-0 nova_compute[192567]: 2025-10-02 08:40:51.648 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:40:51 compute-0 nova_compute[192567]: 2025-10-02 08:40:51.649 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:40:51 compute-0 nova_compute[192567]: 2025-10-02 08:40:51.721 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:40:51 compute-0 nova_compute[192567]: 2025-10-02 08:40:51.795 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:40:51 compute-0 nova_compute[192567]: 2025-10-02 08:40:51.796 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:40:51 compute-0 nova_compute[192567]: 2025-10-02 08:40:51.891 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.098 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.099 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5681MB free_disk=73.43299865722656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.099 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.100 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.156 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Migration for instance 75e90fab-314f-4903-bec1-6446ea4ad7ed refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.177 2 INFO nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Updating resource usage from migration 563802e7-260c-401a-b026-8e170cecda82
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.178 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Starting to track incoming migration 563802e7-260c-401a-b026-8e170cecda82 with flavor 932d352e-81e8-4137-94d3-19616d5c2ae2 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.256 2 WARNING nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 75e90fab-314f-4903-bec1-6446ea4ad7ed has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.256 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.257 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.273 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.292 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.293 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.310 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.338 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.377 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.403 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.443 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.443 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.576 2 INFO nova.virt.libvirt.driver [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.583 2 DEBUG nova.compute.manager [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:40:52 compute-0 nova_compute[192567]: 2025-10-02 08:40:52.606 2 DEBUG nova.objects.instance [None req-cc2cfb2d-9da7-4b90-9897-43308258586e f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:40:54 compute-0 nova_compute[192567]: 2025-10-02 08:40:54.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:54 compute-0 nova_compute[192567]: 2025-10-02 08:40:54.444 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:40:54 compute-0 nova_compute[192567]: 2025-10-02 08:40:54.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:40:54 compute-0 nova_compute[192567]: 2025-10-02 08:40:54.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:40:54 compute-0 nova_compute[192567]: 2025-10-02 08:40:54.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:40:57 compute-0 nova_compute[192567]: 2025-10-02 08:40:57.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.348 2 DEBUG oslo_concurrency.lockutils [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Acquiring lock "75e90fab-314f-4903-bec1-6446ea4ad7ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.350 2 DEBUG oslo_concurrency.lockutils [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "75e90fab-314f-4903-bec1-6446ea4ad7ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.350 2 DEBUG oslo_concurrency.lockutils [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Acquiring lock "75e90fab-314f-4903-bec1-6446ea4ad7ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.351 2 DEBUG oslo_concurrency.lockutils [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "75e90fab-314f-4903-bec1-6446ea4ad7ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.351 2 DEBUG oslo_concurrency.lockutils [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "75e90fab-314f-4903-bec1-6446ea4ad7ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.353 2 INFO nova.compute.manager [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Terminating instance
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.354 2 DEBUG nova.compute.manager [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:40:58 compute-0 kernel: tap49e98fa1-22 (unregistering): left promiscuous mode
Oct 02 08:40:58 compute-0 NetworkManager[51654]: <info>  [1759394458.3797] device (tap49e98fa1-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:58 compute-0 ovn_controller[94821]: 2025-10-02T08:40:58Z|00239|binding|INFO|Releasing lport 49e98fa1-221b-4416-a850-f14fd001fc00 from this chassis (sb_readonly=0)
Oct 02 08:40:58 compute-0 ovn_controller[94821]: 2025-10-02T08:40:58Z|00240|binding|INFO|Setting lport 49e98fa1-221b-4416-a850-f14fd001fc00 down in Southbound
Oct 02 08:40:58 compute-0 ovn_controller[94821]: 2025-10-02T08:40:58Z|00241|binding|INFO|Removing iface tap49e98fa1-22 ovn-installed in OVS
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.398 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:ac:11 10.100.0.9'], port_security=['fa:16:3e:a2:ac:11 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '75e90fab-314f-4903-bec1-6446ea4ad7ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a09407-34b6-42b5-8fee-510c4d23f792', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cfed6615d64e404ab1542b20621438d9', 'neutron:revision_number': '13', 'neutron:security_group_ids': '7a657333-92c9-49e2-9326-8e87ae1eae40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=177cfe6a-1924-46ab-8ec3-256ed9c4e2cc, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=49e98fa1-221b-4416-a850-f14fd001fc00) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.400 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 49e98fa1-221b-4416-a850-f14fd001fc00 in datapath 42a09407-34b6-42b5-8fee-510c4d23f792 unbound from our chassis
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.402 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42a09407-34b6-42b5-8fee-510c4d23f792, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.403 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[d830e5c2-3f0b-49b5-9774-e448d2ec4929]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.404 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792 namespace which is not needed anymore
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:58 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Oct 02 08:40:58 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001e.scope: Consumed 2.062s CPU time.
Oct 02 08:40:58 compute-0 systemd-machined[152597]: Machine qemu-22-instance-0000001e terminated.
Oct 02 08:40:58 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[227270]: [NOTICE]   (227274) : haproxy version is 2.8.14-c23fe91
Oct 02 08:40:58 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[227270]: [NOTICE]   (227274) : path to executable is /usr/sbin/haproxy
Oct 02 08:40:58 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[227270]: [WARNING]  (227274) : Exiting Master process...
Oct 02 08:40:58 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[227270]: [ALERT]    (227274) : Current worker (227276) exited with code 143 (Terminated)
Oct 02 08:40:58 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[227270]: [WARNING]  (227274) : All workers exited. Exiting... (0)
Oct 02 08:40:58 compute-0 systemd[1]: libpod-3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7.scope: Deactivated successfully.
Oct 02 08:40:58 compute-0 conmon[227270]: conmon 3dac6913c24c38e2a66f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7.scope/container/memory.events
Oct 02 08:40:58 compute-0 podman[227340]: 2025-10-02 08:40:58.56411929 +0000 UTC m=+0.055786615 container died 3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7-userdata-shm.mount: Deactivated successfully.
Oct 02 08:40:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-f8afb6870ef1682b56b8a817e0e4f18cf147222160911cbe379961066515c917-merged.mount: Deactivated successfully.
Oct 02 08:40:58 compute-0 podman[227340]: 2025-10-02 08:40:58.611098236 +0000 UTC m=+0.102765561 container cleanup 3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct 02 08:40:58 compute-0 systemd[1]: libpod-conmon-3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7.scope: Deactivated successfully.
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.633 2 DEBUG nova.compute.manager [req-32f914e5-88fe-4e96-9633-29770caf4bea req-a3ae3b24-7ff6-4d98-a335-6af835748c13 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Received event network-vif-unplugged-49e98fa1-221b-4416-a850-f14fd001fc00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.633 2 DEBUG oslo_concurrency.lockutils [req-32f914e5-88fe-4e96-9633-29770caf4bea req-a3ae3b24-7ff6-4d98-a335-6af835748c13 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "75e90fab-314f-4903-bec1-6446ea4ad7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.634 2 DEBUG oslo_concurrency.lockutils [req-32f914e5-88fe-4e96-9633-29770caf4bea req-a3ae3b24-7ff6-4d98-a335-6af835748c13 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "75e90fab-314f-4903-bec1-6446ea4ad7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.634 2 DEBUG oslo_concurrency.lockutils [req-32f914e5-88fe-4e96-9633-29770caf4bea req-a3ae3b24-7ff6-4d98-a335-6af835748c13 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "75e90fab-314f-4903-bec1-6446ea4ad7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.634 2 DEBUG nova.compute.manager [req-32f914e5-88fe-4e96-9633-29770caf4bea req-a3ae3b24-7ff6-4d98-a335-6af835748c13 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] No waiting events found dispatching network-vif-unplugged-49e98fa1-221b-4416-a850-f14fd001fc00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.635 2 DEBUG nova.compute.manager [req-32f914e5-88fe-4e96-9633-29770caf4bea req-a3ae3b24-7ff6-4d98-a335-6af835748c13 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Received event network-vif-unplugged-49e98fa1-221b-4416-a850-f14fd001fc00 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.639 2 INFO nova.virt.libvirt.driver [-] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Instance destroyed successfully.
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.640 2 DEBUG nova.objects.instance [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lazy-loading 'resources' on Instance uuid 75e90fab-314f-4903-bec1-6446ea4ad7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.653 2 DEBUG nova.virt.libvirt.vif [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:39:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-2095835694',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-2095835694',id=30,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:40:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cfed6615d64e404ab1542b20621438d9',ramdisk_id='',reservation_id='r-u3yn0w0c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-2031848124',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-2031848124-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:40:52Z,user_data=None,user_id='ab2d5dc08c96417b93ba3fc03cddf0cd',uuid=75e90fab-314f-4903-bec1-6446ea4ad7ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49e98fa1-221b-4416-a850-f14fd001fc00", "address": "fa:16:3e:a2:ac:11", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49e98fa1-22", "ovs_interfaceid": "49e98fa1-221b-4416-a850-f14fd001fc00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.654 2 DEBUG nova.network.os_vif_util [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Converting VIF {"id": "49e98fa1-221b-4416-a850-f14fd001fc00", "address": "fa:16:3e:a2:ac:11", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49e98fa1-22", "ovs_interfaceid": "49e98fa1-221b-4416-a850-f14fd001fc00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.656 2 DEBUG nova.network.os_vif_util [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a2:ac:11,bridge_name='br-int',has_traffic_filtering=True,id=49e98fa1-221b-4416-a850-f14fd001fc00,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49e98fa1-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.656 2 DEBUG os_vif [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:ac:11,bridge_name='br-int',has_traffic_filtering=True,id=49e98fa1-221b-4416-a850-f14fd001fc00,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49e98fa1-22') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.659 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49e98fa1-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.665 2 INFO os_vif [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:ac:11,bridge_name='br-int',has_traffic_filtering=True,id=49e98fa1-221b-4416-a850-f14fd001fc00,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49e98fa1-22')
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.666 2 INFO nova.virt.libvirt.driver [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Deleting instance files /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed_del
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.667 2 INFO nova.virt.libvirt.driver [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Deletion of /var/lib/nova/instances/75e90fab-314f-4903-bec1-6446ea4ad7ed_del complete
Oct 02 08:40:58 compute-0 podman[227386]: 2025-10-02 08:40:58.692810762 +0000 UTC m=+0.048527795 container remove 3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.700 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8778b869-8990-44df-901e-a4d80e2bc93f]: (4, ('Thu Oct  2 08:40:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792 (3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7)\n3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7\nThu Oct  2 08:40:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792 (3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7)\n3dac6913c24c38e2a66f972dfae1ab00bd99a97d5e5f4ec9bc700095796bd6f7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.702 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[da96470a-f8cc-4f26-80e2-9b864ce893b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.703 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42a09407-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:58 compute-0 kernel: tap42a09407-30: left promiscuous mode
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.715 2 INFO nova.compute.manager [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Took 0.36 seconds to destroy the instance on the hypervisor.
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.716 2 DEBUG oslo.service.loopingcall [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.717 2 DEBUG nova.compute.manager [-] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.717 2 DEBUG nova.network.neutron [-] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:40:58 compute-0 nova_compute[192567]: 2025-10-02 08:40:58.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.731 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[74a69b67-205e-4732-8c02-b2276eedaca6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.757 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[a124ad7d-7724-4cd2-809c-dffd5ed288ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.759 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[11e7a5fd-8139-4d8d-8702-5fc3b6d4fe99]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.778 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b4339bfe-d98a-4bc9-b2e8-40d8079dee78]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533458, 'reachable_time': 25057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227402, 'error': None, 'target': 'ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.781 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:40:58 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:40:58.781 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[25bac017-19ff-43ab-a8ac-07d95a740efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:40:58 compute-0 systemd[1]: run-netns-ovnmeta\x2d42a09407\x2d34b6\x2d42b5\x2d8fee\x2d510c4d23f792.mount: Deactivated successfully.
Oct 02 08:40:59 compute-0 nova_compute[192567]: 2025-10-02 08:40:59.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:40:59 compute-0 nova_compute[192567]: 2025-10-02 08:40:59.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:40:59 compute-0 nova_compute[192567]: 2025-10-02 08:40:59.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:40:59 compute-0 podman[203011]: time="2025-10-02T08:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:40:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:40:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.481 2 DEBUG nova.network.neutron [-] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.508 2 INFO nova.compute.manager [-] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Took 1.79 seconds to deallocate network for instance.
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.563 2 DEBUG oslo_concurrency.lockutils [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.563 2 DEBUG oslo_concurrency.lockutils [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.569 2 DEBUG oslo_concurrency.lockutils [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.589 2 DEBUG nova.compute.manager [req-e4d78593-3935-449d-b54e-93950bdc5614 req-82ebdeb9-f63e-40b1-9f5c-3706160d2a93 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Received event network-vif-deleted-49e98fa1-221b-4416-a850-f14fd001fc00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.637 2 INFO nova.scheduler.client.report [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Deleted allocations for instance 75e90fab-314f-4903-bec1-6446ea4ad7ed
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.698 2 DEBUG oslo_concurrency.lockutils [None req-6a0f83b6-5b65-41da-8772-ec55f2e3a9a3 ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "75e90fab-314f-4903-bec1-6446ea4ad7ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.747 2 DEBUG nova.compute.manager [req-adef9429-500b-42f5-b1e6-9054d63f39b9 req-f4537d66-bc06-400d-943d-c2c5a58ae575 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Received event network-vif-plugged-49e98fa1-221b-4416-a850-f14fd001fc00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.748 2 DEBUG oslo_concurrency.lockutils [req-adef9429-500b-42f5-b1e6-9054d63f39b9 req-f4537d66-bc06-400d-943d-c2c5a58ae575 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "75e90fab-314f-4903-bec1-6446ea4ad7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.748 2 DEBUG oslo_concurrency.lockutils [req-adef9429-500b-42f5-b1e6-9054d63f39b9 req-f4537d66-bc06-400d-943d-c2c5a58ae575 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "75e90fab-314f-4903-bec1-6446ea4ad7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.748 2 DEBUG oslo_concurrency.lockutils [req-adef9429-500b-42f5-b1e6-9054d63f39b9 req-f4537d66-bc06-400d-943d-c2c5a58ae575 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "75e90fab-314f-4903-bec1-6446ea4ad7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.748 2 DEBUG nova.compute.manager [req-adef9429-500b-42f5-b1e6-9054d63f39b9 req-f4537d66-bc06-400d-943d-c2c5a58ae575 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] No waiting events found dispatching network-vif-plugged-49e98fa1-221b-4416-a850-f14fd001fc00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:41:00 compute-0 nova_compute[192567]: 2025-10-02 08:41:00.749 2 WARNING nova.compute.manager [req-adef9429-500b-42f5-b1e6-9054d63f39b9 req-f4537d66-bc06-400d-943d-c2c5a58ae575 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Received unexpected event network-vif-plugged-49e98fa1-221b-4416-a850-f14fd001fc00 for instance with vm_state deleted and task_state None.
Oct 02 08:41:01 compute-0 openstack_network_exporter[205118]: ERROR   08:41:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:41:01 compute-0 openstack_network_exporter[205118]: ERROR   08:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:41:01 compute-0 openstack_network_exporter[205118]: ERROR   08:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:41:01 compute-0 openstack_network_exporter[205118]: ERROR   08:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:41:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:41:01 compute-0 openstack_network_exporter[205118]: ERROR   08:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:41:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:41:03 compute-0 nova_compute[192567]: 2025-10-02 08:41:03.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:04 compute-0 nova_compute[192567]: 2025-10-02 08:41:04.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:04 compute-0 nova_compute[192567]: 2025-10-02 08:41:04.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:06 compute-0 podman[227403]: 2025-10-02 08:41:06.177136651 +0000 UTC m=+0.079803768 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:41:06 compute-0 podman[227406]: 2025-10-02 08:41:06.177222364 +0000 UTC m=+0.068428031 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:41:06 compute-0 podman[227405]: 2025-10-02 08:41:06.217407326 +0000 UTC m=+0.107319513 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:41:06 compute-0 podman[227404]: 2025-10-02 08:41:06.217558981 +0000 UTC m=+0.115853911 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 02 08:41:08 compute-0 nova_compute[192567]: 2025-10-02 08:41:08.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:08 compute-0 sshd-session[227488]: Invalid user loginuser from 193.32.162.151 port 54778
Oct 02 08:41:08 compute-0 sshd-session[227488]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 08:41:08 compute-0 sshd-session[227488]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.32.162.151
Oct 02 08:41:09 compute-0 nova_compute[192567]: 2025-10-02 08:41:09.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:10 compute-0 sshd-session[227488]: Failed password for invalid user loginuser from 193.32.162.151 port 54778 ssh2
Oct 02 08:41:11 compute-0 podman[227490]: 2025-10-02 08:41:11.146116109 +0000 UTC m=+0.063438293 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:41:12 compute-0 sshd-session[227488]: Connection closed by invalid user loginuser 193.32.162.151 port 54778 [preauth]
Oct 02 08:41:13 compute-0 nova_compute[192567]: 2025-10-02 08:41:13.636 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759394458.635366, 75e90fab-314f-4903-bec1-6446ea4ad7ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:41:13 compute-0 nova_compute[192567]: 2025-10-02 08:41:13.637 2 INFO nova.compute.manager [-] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] VM Stopped (Lifecycle Event)
Oct 02 08:41:13 compute-0 nova_compute[192567]: 2025-10-02 08:41:13.663 2 DEBUG nova.compute.manager [None req-87175a06-e421-428e-b312-e8eedc441837 - - - - - -] [instance: 75e90fab-314f-4903-bec1-6446ea4ad7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:41:13 compute-0 nova_compute[192567]: 2025-10-02 08:41:13.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.625 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.626 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.627 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.627 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.628 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.628 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.662 2 DEBUG nova.virt.libvirt.imagecache [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.663 2 WARNING nova.virt.libvirt.imagecache [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.663 2 INFO nova.virt.libvirt.imagecache [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Removable base files: /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.664 2 INFO nova.virt.libvirt.imagecache [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.665 2 DEBUG nova.virt.libvirt.imagecache [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.665 2 DEBUG nova.virt.libvirt.imagecache [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Oct 02 08:41:14 compute-0 nova_compute[192567]: 2025-10-02 08:41:14.665 2 DEBUG nova.virt.libvirt.imagecache [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Oct 02 08:41:18 compute-0 nova_compute[192567]: 2025-10-02 08:41:18.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:19 compute-0 nova_compute[192567]: 2025-10-02 08:41:19.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:22 compute-0 podman[227514]: 2025-10-02 08:41:22.158876826 +0000 UTC m=+0.074285575 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:41:23 compute-0 nova_compute[192567]: 2025-10-02 08:41:23.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:24 compute-0 nova_compute[192567]: 2025-10-02 08:41:24.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:28 compute-0 nova_compute[192567]: 2025-10-02 08:41:28.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:29 compute-0 nova_compute[192567]: 2025-10-02 08:41:29.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:29 compute-0 podman[203011]: time="2025-10-02T08:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:41:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:41:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Oct 02 08:41:31 compute-0 openstack_network_exporter[205118]: ERROR   08:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:41:31 compute-0 openstack_network_exporter[205118]: ERROR   08:41:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:41:31 compute-0 openstack_network_exporter[205118]: ERROR   08:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:41:31 compute-0 openstack_network_exporter[205118]: ERROR   08:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:41:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:41:31 compute-0 openstack_network_exporter[205118]: ERROR   08:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:41:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:41:33 compute-0 nova_compute[192567]: 2025-10-02 08:41:33.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:34 compute-0 nova_compute[192567]: 2025-10-02 08:41:34.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:34 compute-0 sshd-session[227536]: Connection closed by 119.29.53.66 port 57774
Oct 02 08:41:37 compute-0 podman[227538]: 2025-10-02 08:41:37.180366893 +0000 UTC m=+0.081322666 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Oct 02 08:41:37 compute-0 podman[227540]: 2025-10-02 08:41:37.208710753 +0000 UTC m=+0.100155577 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3)
Oct 02 08:41:37 compute-0 podman[227541]: 2025-10-02 08:41:37.20955697 +0000 UTC m=+0.094209871 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true)
Oct 02 08:41:37 compute-0 podman[227539]: 2025-10-02 08:41:37.236021901 +0000 UTC m=+0.134331461 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 02 08:41:38 compute-0 nova_compute[192567]: 2025-10-02 08:41:38.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:39 compute-0 nova_compute[192567]: 2025-10-02 08:41:39.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:42 compute-0 podman[227618]: 2025-10-02 08:41:42.175007208 +0000 UTC m=+0.079146447 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:41:43 compute-0 nova_compute[192567]: 2025-10-02 08:41:43.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:44 compute-0 nova_compute[192567]: 2025-10-02 08:41:44.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:44 compute-0 sshd-session[227537]: error: kex_exchange_identification: read: Connection timed out
Oct 02 08:41:44 compute-0 sshd-session[227537]: banner exchange: Connection from 119.29.53.66 port 35030: Connection timed out
Oct 02 08:41:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:41:46.005 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:41:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:41:46.006 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:41:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:41:46.006 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:41:46 compute-0 nova_compute[192567]: 2025-10-02 08:41:46.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:46 compute-0 nova_compute[192567]: 2025-10-02 08:41:46.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:41:46 compute-0 nova_compute[192567]: 2025-10-02 08:41:46.648 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:41:48 compute-0 nova_compute[192567]: 2025-10-02 08:41:48.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:49 compute-0 nova_compute[192567]: 2025-10-02 08:41:49.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:49 compute-0 nova_compute[192567]: 2025-10-02 08:41:49.644 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:50 compute-0 nova_compute[192567]: 2025-10-02 08:41:50.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:50 compute-0 nova_compute[192567]: 2025-10-02 08:41:50.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:41:50 compute-0 nova_compute[192567]: 2025-10-02 08:41:50.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:41:50 compute-0 nova_compute[192567]: 2025-10-02 08:41:50.746 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:41:52 compute-0 nova_compute[192567]: 2025-10-02 08:41:52.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:52 compute-0 nova_compute[192567]: 2025-10-02 08:41:52.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:41:53 compute-0 podman[227642]: 2025-10-02 08:41:53.18066623 +0000 UTC m=+0.090125212 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.637 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.665 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.666 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.666 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.666 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.851 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.852 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5872MB free_disk=73.46203231811523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.852 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.852 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.928 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.928 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.956 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:41:53 compute-0 nova_compute[192567]: 2025-10-02 08:41:53.984 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:41:54 compute-0 nova_compute[192567]: 2025-10-02 08:41:54.014 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:41:54 compute-0 nova_compute[192567]: 2025-10-02 08:41:54.015 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:41:54 compute-0 nova_compute[192567]: 2025-10-02 08:41:54.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:55 compute-0 nova_compute[192567]: 2025-10-02 08:41:55.002 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:55 compute-0 nova_compute[192567]: 2025-10-02 08:41:55.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:41:55.023 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:41:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:41:55.024 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:41:55 compute-0 nova_compute[192567]: 2025-10-02 08:41:55.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:55 compute-0 nova_compute[192567]: 2025-10-02 08:41:55.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:56 compute-0 nova_compute[192567]: 2025-10-02 08:41:56.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:57 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:41:57.027 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:41:57 compute-0 nova_compute[192567]: 2025-10-02 08:41:57.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:41:58 compute-0 nova_compute[192567]: 2025-10-02 08:41:58.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:59 compute-0 nova_compute[192567]: 2025-10-02 08:41:59.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:41:59 compute-0 podman[203011]: time="2025-10-02T08:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:41:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:41:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3007 "" "Go-http-client/1.1"
Oct 02 08:42:00 compute-0 nova_compute[192567]: 2025-10-02 08:42:00.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:00 compute-0 nova_compute[192567]: 2025-10-02 08:42:00.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:42:01 compute-0 openstack_network_exporter[205118]: ERROR   08:42:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:42:01 compute-0 openstack_network_exporter[205118]: ERROR   08:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:42:01 compute-0 openstack_network_exporter[205118]: ERROR   08:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:42:01 compute-0 openstack_network_exporter[205118]: ERROR   08:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:42:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:42:01 compute-0 openstack_network_exporter[205118]: ERROR   08:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:42:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:42:03 compute-0 nova_compute[192567]: 2025-10-02 08:42:03.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:04 compute-0 nova_compute[192567]: 2025-10-02 08:42:04.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:05 compute-0 nova_compute[192567]: 2025-10-02 08:42:05.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:08 compute-0 podman[227666]: 2025-10-02 08:42:08.186016399 +0000 UTC m=+0.086589291 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Oct 02 08:42:08 compute-0 podman[227668]: 2025-10-02 08:42:08.214313008 +0000 UTC m=+0.113611870 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd)
Oct 02 08:42:08 compute-0 podman[227669]: 2025-10-02 08:42:08.215120884 +0000 UTC m=+0.099095084 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:42:08 compute-0 podman[227667]: 2025-10-02 08:42:08.236245947 +0000 UTC m=+0.136818999 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, container_name=ovn_controller)
Oct 02 08:42:08 compute-0 nova_compute[192567]: 2025-10-02 08:42:08.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:09 compute-0 nova_compute[192567]: 2025-10-02 08:42:09.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:13 compute-0 podman[227748]: 2025-10-02 08:42:13.136042348 +0000 UTC m=+0.055267118 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:42:13 compute-0 nova_compute[192567]: 2025-10-02 08:42:13.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:13 compute-0 nova_compute[192567]: 2025-10-02 08:42:13.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:14 compute-0 nova_compute[192567]: 2025-10-02 08:42:14.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:18 compute-0 nova_compute[192567]: 2025-10-02 08:42:18.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:19 compute-0 nova_compute[192567]: 2025-10-02 08:42:19.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:23 compute-0 nova_compute[192567]: 2025-10-02 08:42:23.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:24 compute-0 podman[227772]: 2025-10-02 08:42:24.195258038 +0000 UTC m=+0.097615113 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9)
Oct 02 08:42:24 compute-0 nova_compute[192567]: 2025-10-02 08:42:24.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:25 compute-0 nova_compute[192567]: 2025-10-02 08:42:25.081 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:28 compute-0 nova_compute[192567]: 2025-10-02 08:42:28.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:29 compute-0 nova_compute[192567]: 2025-10-02 08:42:29.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:29 compute-0 podman[203011]: time="2025-10-02T08:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:42:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:42:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3009 "" "Go-http-client/1.1"
Oct 02 08:42:31 compute-0 openstack_network_exporter[205118]: ERROR   08:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:42:31 compute-0 openstack_network_exporter[205118]: ERROR   08:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:42:31 compute-0 openstack_network_exporter[205118]: ERROR   08:42:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:42:31 compute-0 openstack_network_exporter[205118]: ERROR   08:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:42:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:42:31 compute-0 openstack_network_exporter[205118]: ERROR   08:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:42:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:42:33 compute-0 nova_compute[192567]: 2025-10-02 08:42:33.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:34 compute-0 nova_compute[192567]: 2025-10-02 08:42:34.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:38 compute-0 nova_compute[192567]: 2025-10-02 08:42:38.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:39 compute-0 podman[227795]: 2025-10-02 08:42:39.208623614 +0000 UTC m=+0.088463499 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 02 08:42:39 compute-0 podman[227796]: 2025-10-02 08:42:39.222417073 +0000 UTC m=+0.099433851 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 02 08:42:39 compute-0 podman[227793]: 2025-10-02 08:42:39.224417454 +0000 UTC m=+0.119709490 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:42:39 compute-0 podman[227794]: 2025-10-02 08:42:39.278516385 +0000 UTC m=+0.162874040 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 02 08:42:39 compute-0 nova_compute[192567]: 2025-10-02 08:42:39.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:43 compute-0 nova_compute[192567]: 2025-10-02 08:42:43.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:44 compute-0 podman[227875]: 2025-10-02 08:42:44.171712484 +0000 UTC m=+0.080630746 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:42:44 compute-0 nova_compute[192567]: 2025-10-02 08:42:44.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:42:46.007 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:42:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:42:46.008 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:42:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:42:46.008 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:42:46 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 08:42:48 compute-0 nova_compute[192567]: 2025-10-02 08:42:48.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:49 compute-0 nova_compute[192567]: 2025-10-02 08:42:49.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:51 compute-0 nova_compute[192567]: 2025-10-02 08:42:51.268 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:51 compute-0 nova_compute[192567]: 2025-10-02 08:42:51.269 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:51 compute-0 nova_compute[192567]: 2025-10-02 08:42:51.269 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:42:51 compute-0 nova_compute[192567]: 2025-10-02 08:42:51.269 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:42:51 compute-0 nova_compute[192567]: 2025-10-02 08:42:51.293 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:42:53 compute-0 nova_compute[192567]: 2025-10-02 08:42:53.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.664 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.664 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.664 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.665 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.834 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.835 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5875MB free_disk=73.46184921264648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.835 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.835 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.931 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.932 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:42:54 compute-0 nova_compute[192567]: 2025-10-02 08:42:54.993 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:42:55 compute-0 nova_compute[192567]: 2025-10-02 08:42:55.009 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:42:55 compute-0 nova_compute[192567]: 2025-10-02 08:42:55.010 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:42:55 compute-0 nova_compute[192567]: 2025-10-02 08:42:55.010 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:42:55 compute-0 podman[227901]: 2025-10-02 08:42:55.193507072 +0000 UTC m=+0.096625203 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:42:56 compute-0 nova_compute[192567]: 2025-10-02 08:42:56.010 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:56 compute-0 nova_compute[192567]: 2025-10-02 08:42:56.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:56 compute-0 nova_compute[192567]: 2025-10-02 08:42:56.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:42:58 compute-0 nova_compute[192567]: 2025-10-02 08:42:58.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:58 compute-0 ovn_controller[94821]: 2025-10-02T08:42:58Z|00242|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Oct 02 08:42:59 compute-0 nova_compute[192567]: 2025-10-02 08:42:59.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:42:59 compute-0 podman[203011]: time="2025-10-02T08:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:42:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:42:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Oct 02 08:43:01 compute-0 openstack_network_exporter[205118]: ERROR   08:43:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:43:01 compute-0 openstack_network_exporter[205118]: ERROR   08:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:43:01 compute-0 openstack_network_exporter[205118]: ERROR   08:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:43:01 compute-0 openstack_network_exporter[205118]: ERROR   08:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:43:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:43:01 compute-0 openstack_network_exporter[205118]: ERROR   08:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:43:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:43:02 compute-0 nova_compute[192567]: 2025-10-02 08:43:02.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:43:02 compute-0 nova_compute[192567]: 2025-10-02 08:43:02.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:43:03 compute-0 nova_compute[192567]: 2025-10-02 08:43:03.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:04 compute-0 nova_compute[192567]: 2025-10-02 08:43:04.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:06 compute-0 nova_compute[192567]: 2025-10-02 08:43:06.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:43:08 compute-0 nova_compute[192567]: 2025-10-02 08:43:08.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:09 compute-0 nova_compute[192567]: 2025-10-02 08:43:09.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:10 compute-0 podman[227924]: 2025-10-02 08:43:10.174171373 +0000 UTC m=+0.080177588 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 02 08:43:10 compute-0 podman[227926]: 2025-10-02 08:43:10.199279524 +0000 UTC m=+0.088218797 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 02 08:43:10 compute-0 podman[227932]: 2025-10-02 08:43:10.230171926 +0000 UTC m=+0.112836504 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid)
Oct 02 08:43:10 compute-0 podman[227925]: 2025-10-02 08:43:10.235692848 +0000 UTC m=+0.129078740 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 02 08:43:13 compute-0 nova_compute[192567]: 2025-10-02 08:43:13.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:14 compute-0 nova_compute[192567]: 2025-10-02 08:43:14.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:15 compute-0 podman[228004]: 2025-10-02 08:43:15.170716231 +0000 UTC m=+0.077158353 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:43:18 compute-0 nova_compute[192567]: 2025-10-02 08:43:18.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:19 compute-0 nova_compute[192567]: 2025-10-02 08:43:19.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:23 compute-0 nova_compute[192567]: 2025-10-02 08:43:23.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:24 compute-0 nova_compute[192567]: 2025-10-02 08:43:24.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:26 compute-0 podman[228028]: 2025-10-02 08:43:26.218599266 +0000 UTC m=+0.119959176 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, io.openshift.expose-services=, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Oct 02 08:43:28 compute-0 nova_compute[192567]: 2025-10-02 08:43:28.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:29 compute-0 nova_compute[192567]: 2025-10-02 08:43:29.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:29 compute-0 podman[203011]: time="2025-10-02T08:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:43:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:43:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 02 08:43:31 compute-0 openstack_network_exporter[205118]: ERROR   08:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:43:31 compute-0 openstack_network_exporter[205118]: ERROR   08:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:43:31 compute-0 openstack_network_exporter[205118]: ERROR   08:43:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:43:31 compute-0 openstack_network_exporter[205118]: ERROR   08:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:43:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:43:31 compute-0 openstack_network_exporter[205118]: ERROR   08:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:43:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:43:33 compute-0 nova_compute[192567]: 2025-10-02 08:43:33.310 2 DEBUG nova.virt.libvirt.driver [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Creating tmpfile /var/lib/nova/instances/tmpz81wlw30 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:43:33 compute-0 nova_compute[192567]: 2025-10-02 08:43:33.311 2 DEBUG nova.compute.manager [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz81wlw30',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:43:33 compute-0 nova_compute[192567]: 2025-10-02 08:43:33.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:34 compute-0 nova_compute[192567]: 2025-10-02 08:43:34.388 2 DEBUG nova.compute.manager [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz81wlw30',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fbaabbd4-48b4-4f5e-a7e6-e71da0917c01',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:43:34 compute-0 nova_compute[192567]: 2025-10-02 08:43:34.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:34 compute-0 nova_compute[192567]: 2025-10-02 08:43:34.420 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-fbaabbd4-48b4-4f5e-a7e6-e71da0917c01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:43:34 compute-0 nova_compute[192567]: 2025-10-02 08:43:34.421 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-fbaabbd4-48b4-4f5e-a7e6-e71da0917c01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:43:34 compute-0 nova_compute[192567]: 2025-10-02 08:43:34.421 2 DEBUG nova.network.neutron [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.570 2 DEBUG nova.network.neutron [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Updating instance_info_cache with network_info: [{"id": "01051fd6-1d94-4199-bdca-cc108fc67855", "address": "fa:16:3e:ed:8b:0c", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01051fd6-1d", "ovs_interfaceid": "01051fd6-1d94-4199-bdca-cc108fc67855", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.600 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-fbaabbd4-48b4-4f5e-a7e6-e71da0917c01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.604 2 DEBUG nova.virt.libvirt.driver [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz81wlw30',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fbaabbd4-48b4-4f5e-a7e6-e71da0917c01',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.605 2 DEBUG nova.virt.libvirt.driver [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Creating instance directory: /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.606 2 DEBUG nova.virt.libvirt.driver [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Creating disk.info with the contents: {'/var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk': 'qcow2', '/var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.607 2 DEBUG nova.virt.libvirt.driver [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.608 2 DEBUG nova.objects.instance [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.657 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.751 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.756 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.757 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.779 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.870 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.871 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.918 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.920 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:43:38 compute-0 nova_compute[192567]: 2025-10-02 08:43:38.921 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.010 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.012 2 DEBUG nova.virt.disk.api [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.012 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.110 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.113 2 DEBUG nova.virt.disk.api [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.113 2 DEBUG nova.objects.instance [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.160 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.209 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk.config 485376" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.212 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk.config to /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.213 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk.config /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.869 2 DEBUG oslo_concurrency.processutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk.config /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01" returned: 0 in 0.656s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.870 2 DEBUG nova.virt.libvirt.driver [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.871 2 DEBUG nova.virt.libvirt.vif [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:42:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1118148026',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1118148026',id=34,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:42:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cfed6615d64e404ab1542b20621438d9',ramdisk_id='',reservation_id='r-exwj766f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-2031848124',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-2031848124-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:42:39Z,user_data=None,user_id='ab2d5dc08c96417b93ba3fc03cddf0cd',uuid=fbaabbd4-48b4-4f5e-a7e6-e71da0917c01,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01051fd6-1d94-4199-bdca-cc108fc67855", "address": "fa:16:3e:ed:8b:0c", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap01051fd6-1d", "ovs_interfaceid": "01051fd6-1d94-4199-bdca-cc108fc67855", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.872 2 DEBUG nova.network.os_vif_util [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "01051fd6-1d94-4199-bdca-cc108fc67855", "address": "fa:16:3e:ed:8b:0c", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap01051fd6-1d", "ovs_interfaceid": "01051fd6-1d94-4199-bdca-cc108fc67855", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.872 2 DEBUG nova.network.os_vif_util [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:8b:0c,bridge_name='br-int',has_traffic_filtering=True,id=01051fd6-1d94-4199-bdca-cc108fc67855,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01051fd6-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.873 2 DEBUG os_vif [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:8b:0c,bridge_name='br-int',has_traffic_filtering=True,id=01051fd6-1d94-4199-bdca-cc108fc67855,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01051fd6-1d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.874 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.874 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.877 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01051fd6-1d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.877 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap01051fd6-1d, col_values=(('external_ids', {'iface-id': '01051fd6-1d94-4199-bdca-cc108fc67855', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ed:8b:0c', 'vm-uuid': 'fbaabbd4-48b4-4f5e-a7e6-e71da0917c01'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:39 compute-0 NetworkManager[51654]: <info>  [1759394619.8804] manager: (tap01051fd6-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.890 2 INFO os_vif [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:8b:0c,bridge_name='br-int',has_traffic_filtering=True,id=01051fd6-1d94-4199-bdca-cc108fc67855,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01051fd6-1d')
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.892 2 DEBUG nova.virt.libvirt.driver [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:43:39 compute-0 nova_compute[192567]: 2025-10-02 08:43:39.892 2 DEBUG nova.compute.manager [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz81wlw30',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fbaabbd4-48b4-4f5e-a7e6-e71da0917c01',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:43:40 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:40.681 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:43:40 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:40.682 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:43:40 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:40.683 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:43:40 compute-0 nova_compute[192567]: 2025-10-02 08:43:40.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:41 compute-0 podman[228071]: 2025-10-02 08:43:41.174253413 +0000 UTC m=+0.081228110 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct 02 08:43:41 compute-0 podman[228073]: 2025-10-02 08:43:41.207785997 +0000 UTC m=+0.097541038 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 02 08:43:41 compute-0 podman[228075]: 2025-10-02 08:43:41.20978945 +0000 UTC m=+0.094491234 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 02 08:43:41 compute-0 podman[228072]: 2025-10-02 08:43:41.235013475 +0000 UTC m=+0.129111421 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 02 08:43:42 compute-0 nova_compute[192567]: 2025-10-02 08:43:42.024 2 DEBUG nova.network.neutron [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Port 01051fd6-1d94-4199-bdca-cc108fc67855 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:43:42 compute-0 nova_compute[192567]: 2025-10-02 08:43:42.026 2 DEBUG nova.compute.manager [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpz81wlw30',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fbaabbd4-48b4-4f5e-a7e6-e71da0917c01',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:43:42 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 08:43:42 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 08:43:42 compute-0 kernel: tap01051fd6-1d: entered promiscuous mode
Oct 02 08:43:42 compute-0 NetworkManager[51654]: <info>  [1759394622.4319] manager: (tap01051fd6-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Oct 02 08:43:42 compute-0 ovn_controller[94821]: 2025-10-02T08:43:42Z|00243|binding|INFO|Claiming lport 01051fd6-1d94-4199-bdca-cc108fc67855 for this additional chassis.
Oct 02 08:43:42 compute-0 ovn_controller[94821]: 2025-10-02T08:43:42Z|00244|binding|INFO|01051fd6-1d94-4199-bdca-cc108fc67855: Claiming fa:16:3e:ed:8b:0c 10.100.0.4
Oct 02 08:43:42 compute-0 nova_compute[192567]: 2025-10-02 08:43:42.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:42 compute-0 ovn_controller[94821]: 2025-10-02T08:43:42Z|00245|binding|INFO|Setting lport 01051fd6-1d94-4199-bdca-cc108fc67855 ovn-installed in OVS
Oct 02 08:43:42 compute-0 nova_compute[192567]: 2025-10-02 08:43:42.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:42 compute-0 nova_compute[192567]: 2025-10-02 08:43:42.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:42 compute-0 systemd-udevd[228185]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:43:42 compute-0 systemd-machined[152597]: New machine qemu-23-instance-00000022.
Oct 02 08:43:42 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-00000022.
Oct 02 08:43:42 compute-0 NetworkManager[51654]: <info>  [1759394622.5109] device (tap01051fd6-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:43:42 compute-0 NetworkManager[51654]: <info>  [1759394622.5131] device (tap01051fd6-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:43:43 compute-0 nova_compute[192567]: 2025-10-02 08:43:43.510 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394623.5102425, fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:43:43 compute-0 nova_compute[192567]: 2025-10-02 08:43:43.512 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] VM Started (Lifecycle Event)
Oct 02 08:43:43 compute-0 nova_compute[192567]: 2025-10-02 08:43:43.541 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:43:44 compute-0 nova_compute[192567]: 2025-10-02 08:43:44.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:44 compute-0 nova_compute[192567]: 2025-10-02 08:43:44.424 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394624.4236221, fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:43:44 compute-0 nova_compute[192567]: 2025-10-02 08:43:44.424 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] VM Resumed (Lifecycle Event)
Oct 02 08:43:44 compute-0 nova_compute[192567]: 2025-10-02 08:43:44.446 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:43:44 compute-0 nova_compute[192567]: 2025-10-02 08:43:44.452 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:43:44 compute-0 nova_compute[192567]: 2025-10-02 08:43:44.481 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:43:44 compute-0 nova_compute[192567]: 2025-10-02 08:43:44.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.008 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.009 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.009 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:43:46 compute-0 podman[228217]: 2025-10-02 08:43:46.192817296 +0000 UTC m=+0.090591911 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:43:46 compute-0 ovn_controller[94821]: 2025-10-02T08:43:46Z|00246|binding|INFO|Claiming lport 01051fd6-1d94-4199-bdca-cc108fc67855 for this chassis.
Oct 02 08:43:46 compute-0 ovn_controller[94821]: 2025-10-02T08:43:46Z|00247|binding|INFO|01051fd6-1d94-4199-bdca-cc108fc67855: Claiming fa:16:3e:ed:8b:0c 10.100.0.4
Oct 02 08:43:46 compute-0 ovn_controller[94821]: 2025-10-02T08:43:46Z|00248|binding|INFO|Setting lport 01051fd6-1d94-4199-bdca-cc108fc67855 up in Southbound
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.799 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:8b:0c 10.100.0.4'], port_security=['fa:16:3e:ed:8b:0c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'fbaabbd4-48b4-4f5e-a7e6-e71da0917c01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a09407-34b6-42b5-8fee-510c4d23f792', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cfed6615d64e404ab1542b20621438d9', 'neutron:revision_number': '11', 'neutron:security_group_ids': '7a657333-92c9-49e2-9326-8e87ae1eae40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=177cfe6a-1924-46ab-8ec3-256ed9c4e2cc, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=01051fd6-1d94-4199-bdca-cc108fc67855) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.801 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 01051fd6-1d94-4199-bdca-cc108fc67855 in datapath 42a09407-34b6-42b5-8fee-510c4d23f792 bound to our chassis
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.804 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 42a09407-34b6-42b5-8fee-510c4d23f792
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.826 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[99ec76ae-310a-42c4-a342-aaac360f96be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.828 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap42a09407-31 in ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.831 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap42a09407-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.832 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[cd76b11b-aa4e-4a1a-841b-70ac1d2d31a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.833 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f281eaee-d7d1-4a3b-ab79-047156311571]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.855 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[6afb3ebd-026c-4a29-9276-b22164068f1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.898 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4bcc697f-3af8-4fe9-b639-044a744f416c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.949 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[2fcbd5b8-5aa2-4925-935d-5986c5bb459e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:46.960 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[9b2df01c-d74e-47e1-8588-a9cd127f088d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:46 compute-0 NetworkManager[51654]: <info>  [1759394626.9645] manager: (tap42a09407-30): new Veth device (/org/freedesktop/NetworkManager/Devices/95)
Oct 02 08:43:46 compute-0 systemd-udevd[228248]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:43:47 compute-0 nova_compute[192567]: 2025-10-02 08:43:47.023 2 INFO nova.compute.manager [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Post operation of migration started
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.027 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[47b13de7-4c8f-424e-9819-2221d4bf863f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.040 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[f7218b8e-bfbc-46bb-8b97-c0193477539c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:47 compute-0 NetworkManager[51654]: <info>  [1759394627.0755] device (tap42a09407-30): carrier: link connected
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.089 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[8808d61a-a419-412e-84f2-d44ed0c76c56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.120 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[1a97eb47-c064-4c98-bc82-2c376f359cf9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42a09407-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:f2:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551466, 'reachable_time': 43740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228268, 'error': None, 'target': 'ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.148 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[bb538d38-a75c-491f-970a-a480c21ab61e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe15:f213'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 551466, 'tstamp': 551466}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228269, 'error': None, 'target': 'ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.175 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[fe00f4d8-b7f2-41c3-89da-c9285c5f3be0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42a09407-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:15:f2:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551466, 'reachable_time': 43740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228270, 'error': None, 'target': 'ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.222 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[df269485-ca88-41c6-8426-888ae71feced]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.330 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb5b2f5-4e3f-4dbe-87fe-59ab1459fa0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.333 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42a09407-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.334 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.335 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42a09407-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:43:47 compute-0 NetworkManager[51654]: <info>  [1759394627.3905] manager: (tap42a09407-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Oct 02 08:43:47 compute-0 kernel: tap42a09407-30: entered promiscuous mode
Oct 02 08:43:47 compute-0 nova_compute[192567]: 2025-10-02 08:43:47.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.398 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap42a09407-30, col_values=(('external_ids', {'iface-id': '07a47f89-e193-4dc6-986d-f5fa01a04e07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:43:47 compute-0 ovn_controller[94821]: 2025-10-02T08:43:47Z|00249|binding|INFO|Releasing lport 07a47f89-e193-4dc6-986d-f5fa01a04e07 from this chassis (sb_readonly=0)
Oct 02 08:43:47 compute-0 nova_compute[192567]: 2025-10-02 08:43:47.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.404 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/42a09407-34b6-42b5-8fee-510c4d23f792.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/42a09407-34b6-42b5-8fee-510c4d23f792.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.405 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8c9003d7-010b-481f-a781-aea4a78b3805]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.406 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-42a09407-34b6-42b5-8fee-510c4d23f792
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/42a09407-34b6-42b5-8fee-510c4d23f792.pid.haproxy
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID 42a09407-34b6-42b5-8fee-510c4d23f792
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:43:47 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:47.408 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792', 'env', 'PROCESS_TAG=haproxy-42a09407-34b6-42b5-8fee-510c4d23f792', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/42a09407-34b6-42b5-8fee-510c4d23f792.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:43:47 compute-0 nova_compute[192567]: 2025-10-02 08:43:47.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:47 compute-0 nova_compute[192567]: 2025-10-02 08:43:47.723 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-fbaabbd4-48b4-4f5e-a7e6-e71da0917c01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:43:47 compute-0 nova_compute[192567]: 2025-10-02 08:43:47.724 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-fbaabbd4-48b4-4f5e-a7e6-e71da0917c01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:43:47 compute-0 nova_compute[192567]: 2025-10-02 08:43:47.726 2 DEBUG nova.network.neutron [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:43:47 compute-0 podman[228303]: 2025-10-02 08:43:47.89531067 +0000 UTC m=+0.082225401 container create 6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001)
Oct 02 08:43:47 compute-0 systemd[1]: Started libpod-conmon-6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0.scope.
Oct 02 08:43:47 compute-0 podman[228303]: 2025-10-02 08:43:47.858104952 +0000 UTC m=+0.045019773 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:43:47 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:43:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eeb2f3ad8de73827dbf540365c25ab7278705892d397a770e9efa57b82f4041a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:43:47 compute-0 podman[228303]: 2025-10-02 08:43:47.998912676 +0000 UTC m=+0.185827507 container init 6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:43:48 compute-0 podman[228303]: 2025-10-02 08:43:48.003772627 +0000 UTC m=+0.190687398 container start 6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001)
Oct 02 08:43:48 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[228318]: [NOTICE]   (228322) : New worker (228324) forked
Oct 02 08:43:48 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[228318]: [NOTICE]   (228322) : Loading success.
Oct 02 08:43:49 compute-0 nova_compute[192567]: 2025-10-02 08:43:49.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:49 compute-0 nova_compute[192567]: 2025-10-02 08:43:49.758 2 DEBUG nova.network.neutron [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Updating instance_info_cache with network_info: [{"id": "01051fd6-1d94-4199-bdca-cc108fc67855", "address": "fa:16:3e:ed:8b:0c", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01051fd6-1d", "ovs_interfaceid": "01051fd6-1d94-4199-bdca-cc108fc67855", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:43:49 compute-0 nova_compute[192567]: 2025-10-02 08:43:49.777 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-fbaabbd4-48b4-4f5e-a7e6-e71da0917c01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:43:49 compute-0 nova_compute[192567]: 2025-10-02 08:43:49.796 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:43:49 compute-0 nova_compute[192567]: 2025-10-02 08:43:49.796 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:43:49 compute-0 nova_compute[192567]: 2025-10-02 08:43:49.797 2 DEBUG oslo_concurrency.lockutils [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:43:49 compute-0 nova_compute[192567]: 2025-10-02 08:43:49.802 2 INFO nova.virt.libvirt.driver [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:43:49 compute-0 virtqemud[192112]: Domain id=23 name='instance-00000022' uuid=fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 is tainted: custom-monitor
Oct 02 08:43:49 compute-0 nova_compute[192567]: 2025-10-02 08:43:49.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:50 compute-0 nova_compute[192567]: 2025-10-02 08:43:50.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:43:50 compute-0 nova_compute[192567]: 2025-10-02 08:43:50.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:43:50 compute-0 nova_compute[192567]: 2025-10-02 08:43:50.623 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:43:50 compute-0 nova_compute[192567]: 2025-10-02 08:43:50.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:43:50 compute-0 nova_compute[192567]: 2025-10-02 08:43:50.655 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:43:50 compute-0 nova_compute[192567]: 2025-10-02 08:43:50.813 2 INFO nova.virt.libvirt.driver [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:43:51 compute-0 nova_compute[192567]: 2025-10-02 08:43:51.822 2 INFO nova.virt.libvirt.driver [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:43:51 compute-0 nova_compute[192567]: 2025-10-02 08:43:51.829 2 DEBUG nova.compute.manager [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:43:51 compute-0 nova_compute[192567]: 2025-10-02 08:43:51.857 2 DEBUG nova.objects.instance [None req-4580d232-4964-4684-9a43-0dd25f5adef6 f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.655 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.656 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.656 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.657 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.770 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.872 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.873 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.952 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:43:54 compute-0 nova_compute[192567]: 2025-10-02 08:43:54.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.137 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.139 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5652MB free_disk=73.43292999267578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.139 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.140 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.221 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.222 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.222 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.357 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.378 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.416 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:43:55 compute-0 nova_compute[192567]: 2025-10-02 08:43:55.416 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.278 2 DEBUG oslo_concurrency.lockutils [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Acquiring lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.278 2 DEBUG oslo_concurrency.lockutils [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.278 2 DEBUG oslo_concurrency.lockutils [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Acquiring lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.279 2 DEBUG oslo_concurrency.lockutils [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.279 2 DEBUG oslo_concurrency.lockutils [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.280 2 INFO nova.compute.manager [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Terminating instance
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.281 2 DEBUG nova.compute.manager [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:43:56 compute-0 kernel: tap01051fd6-1d (unregistering): left promiscuous mode
Oct 02 08:43:56 compute-0 NetworkManager[51654]: <info>  [1759394636.3090] device (tap01051fd6-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:43:56 compute-0 ovn_controller[94821]: 2025-10-02T08:43:56Z|00250|binding|INFO|Releasing lport 01051fd6-1d94-4199-bdca-cc108fc67855 from this chassis (sb_readonly=0)
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:56 compute-0 ovn_controller[94821]: 2025-10-02T08:43:56Z|00251|binding|INFO|Setting lport 01051fd6-1d94-4199-bdca-cc108fc67855 down in Southbound
Oct 02 08:43:56 compute-0 ovn_controller[94821]: 2025-10-02T08:43:56Z|00252|binding|INFO|Removing iface tap01051fd6-1d ovn-installed in OVS
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.378 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:8b:0c 10.100.0.4'], port_security=['fa:16:3e:ed:8b:0c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'fbaabbd4-48b4-4f5e-a7e6-e71da0917c01', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a09407-34b6-42b5-8fee-510c4d23f792', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cfed6615d64e404ab1542b20621438d9', 'neutron:revision_number': '13', 'neutron:security_group_ids': '7a657333-92c9-49e2-9326-8e87ae1eae40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=177cfe6a-1924-46ab-8ec3-256ed9c4e2cc, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=01051fd6-1d94-4199-bdca-cc108fc67855) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.380 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 01051fd6-1d94-4199-bdca-cc108fc67855 in datapath 42a09407-34b6-42b5-8fee-510c4d23f792 unbound from our chassis
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.382 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42a09407-34b6-42b5-8fee-510c4d23f792, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.383 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b39d8d06-b760-4b7b-bca3-58bf31009464]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.384 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792 namespace which is not needed anymore
Oct 02 08:43:56 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000022.scope: Deactivated successfully.
Oct 02 08:43:56 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000022.scope: Consumed 2.217s CPU time.
Oct 02 08:43:56 compute-0 systemd-machined[152597]: Machine qemu-23-instance-00000022 terminated.
Oct 02 08:43:56 compute-0 podman[228341]: 2025-10-02 08:43:56.475414365 +0000 UTC m=+0.090905991 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.559 2 INFO nova.virt.libvirt.driver [-] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Instance destroyed successfully.
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.560 2 DEBUG nova.objects.instance [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lazy-loading 'resources' on Instance uuid fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:43:56 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[228318]: [NOTICE]   (228322) : haproxy version is 2.8.14-c23fe91
Oct 02 08:43:56 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[228318]: [NOTICE]   (228322) : path to executable is /usr/sbin/haproxy
Oct 02 08:43:56 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[228318]: [ALERT]    (228322) : Current worker (228324) exited with code 143 (Terminated)
Oct 02 08:43:56 compute-0 neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792[228318]: [WARNING]  (228322) : All workers exited. Exiting... (0)
Oct 02 08:43:56 compute-0 systemd[1]: libpod-6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0.scope: Deactivated successfully.
Oct 02 08:43:56 compute-0 podman[228384]: 2025-10-02 08:43:56.572423705 +0000 UTC m=+0.054915590 container died 6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.596 2 DEBUG nova.virt.libvirt.vif [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:42:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1118148026',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1118148026',id=34,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:42:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cfed6615d64e404ab1542b20621438d9',ramdisk_id='',reservation_id='r-exwj766f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-2031848124',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-2031848124-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:43:51Z,user_data=None,user_id='ab2d5dc08c96417b93ba3fc03cddf0cd',uuid=fbaabbd4-48b4-4f5e-a7e6-e71da0917c01,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01051fd6-1d94-4199-bdca-cc108fc67855", "address": "fa:16:3e:ed:8b:0c", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01051fd6-1d", "ovs_interfaceid": "01051fd6-1d94-4199-bdca-cc108fc67855", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.597 2 DEBUG nova.network.os_vif_util [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Converting VIF {"id": "01051fd6-1d94-4199-bdca-cc108fc67855", "address": "fa:16:3e:ed:8b:0c", "network": {"id": "42a09407-34b6-42b5-8fee-510c4d23f792", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-29438232-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4e0d4bcf8c1c401bb76039b2d2845a9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01051fd6-1d", "ovs_interfaceid": "01051fd6-1d94-4199-bdca-cc108fc67855", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.599 2 DEBUG nova.network.os_vif_util [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ed:8b:0c,bridge_name='br-int',has_traffic_filtering=True,id=01051fd6-1d94-4199-bdca-cc108fc67855,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01051fd6-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.600 2 DEBUG os_vif [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:8b:0c,bridge_name='br-int',has_traffic_filtering=True,id=01051fd6-1d94-4199-bdca-cc108fc67855,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01051fd6-1d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.604 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01051fd6-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0-userdata-shm.mount: Deactivated successfully.
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:43:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-eeb2f3ad8de73827dbf540365c25ab7278705892d397a770e9efa57b82f4041a-merged.mount: Deactivated successfully.
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.616 2 INFO os_vif [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:8b:0c,bridge_name='br-int',has_traffic_filtering=True,id=01051fd6-1d94-4199-bdca-cc108fc67855,network=Network(42a09407-34b6-42b5-8fee-510c4d23f792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01051fd6-1d')
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.618 2 INFO nova.virt.libvirt.driver [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Deleting instance files /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01_del
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.619 2 INFO nova.virt.libvirt.driver [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Deletion of /var/lib/nova/instances/fbaabbd4-48b4-4f5e-a7e6-e71da0917c01_del complete
Oct 02 08:43:56 compute-0 podman[228384]: 2025-10-02 08:43:56.63037423 +0000 UTC m=+0.112866125 container cleanup 6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:43:56 compute-0 systemd[1]: libpod-conmon-6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0.scope: Deactivated successfully.
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.664 2 INFO nova.compute.manager [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Took 0.38 seconds to destroy the instance on the hypervisor.
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.665 2 DEBUG oslo.service.loopingcall [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.665 2 DEBUG nova.compute.manager [-] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.666 2 DEBUG nova.network.neutron [-] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:43:56 compute-0 podman[228429]: 2025-10-02 08:43:56.709005408 +0000 UTC m=+0.051810084 container remove 6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001)
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.717 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f9fbef71-54f2-41c5-85ae-2f4480dcc726]: (4, ('Thu Oct  2 08:43:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792 (6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0)\n6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0\nThu Oct  2 08:43:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792 (6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0)\n6b265e7473af2a6177b670e7c883ab05148f422aee4b5967730b8689584f63c0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.719 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5dd46d53-9414-4885-be9e-3c7160290088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.720 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42a09407-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:56 compute-0 kernel: tap42a09407-30: left promiscuous mode
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.730 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[900ce5de-cad6-4c6b-aaf7-148bce34ca0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.780 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f4722bfb-3825-4674-a042-289bb807de5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.781 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[04fc66dd-6031-42ad-ba66-52d2bbfb8d73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.798 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[15a82c9e-c1ce-440e-b3b8-f84594822451]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551452, 'reachable_time': 39095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228445, 'error': None, 'target': 'ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.800 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-42a09407-34b6-42b5-8fee-510c4d23f792 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:43:56 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:43:56.800 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[a842955a-2105-4c78-a7e0-e1f49beee451]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:43:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d42a09407\x2d34b6\x2d42b5\x2d8fee\x2d510c4d23f792.mount: Deactivated successfully.
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.925 2 DEBUG nova.compute.manager [req-9e64715b-29e3-4ba4-9866-84616fc5ec59 req-a3b00548-ffc0-4786-81d8-87fbfec03560 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Received event network-vif-unplugged-01051fd6-1d94-4199-bdca-cc108fc67855 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.926 2 DEBUG oslo_concurrency.lockutils [req-9e64715b-29e3-4ba4-9866-84616fc5ec59 req-a3b00548-ffc0-4786-81d8-87fbfec03560 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.926 2 DEBUG oslo_concurrency.lockutils [req-9e64715b-29e3-4ba4-9866-84616fc5ec59 req-a3b00548-ffc0-4786-81d8-87fbfec03560 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.927 2 DEBUG oslo_concurrency.lockutils [req-9e64715b-29e3-4ba4-9866-84616fc5ec59 req-a3b00548-ffc0-4786-81d8-87fbfec03560 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.927 2 DEBUG nova.compute.manager [req-9e64715b-29e3-4ba4-9866-84616fc5ec59 req-a3b00548-ffc0-4786-81d8-87fbfec03560 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] No waiting events found dispatching network-vif-unplugged-01051fd6-1d94-4199-bdca-cc108fc67855 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:43:56 compute-0 nova_compute[192567]: 2025-10-02 08:43:56.927 2 DEBUG nova.compute.manager [req-9e64715b-29e3-4ba4-9866-84616fc5ec59 req-a3b00548-ffc0-4786-81d8-87fbfec03560 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Received event network-vif-unplugged-01051fd6-1d94-4199-bdca-cc108fc67855 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:43:57 compute-0 nova_compute[192567]: 2025-10-02 08:43:57.187 2 DEBUG nova.network.neutron [-] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:43:57 compute-0 nova_compute[192567]: 2025-10-02 08:43:57.204 2 INFO nova.compute.manager [-] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Took 0.54 seconds to deallocate network for instance.
Oct 02 08:43:57 compute-0 nova_compute[192567]: 2025-10-02 08:43:57.243 2 DEBUG oslo_concurrency.lockutils [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:43:57 compute-0 nova_compute[192567]: 2025-10-02 08:43:57.244 2 DEBUG oslo_concurrency.lockutils [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:43:57 compute-0 nova_compute[192567]: 2025-10-02 08:43:57.317 2 DEBUG nova.compute.provider_tree [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:43:57 compute-0 nova_compute[192567]: 2025-10-02 08:43:57.333 2 DEBUG nova.scheduler.client.report [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:43:57 compute-0 nova_compute[192567]: 2025-10-02 08:43:57.356 2 DEBUG oslo_concurrency.lockutils [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:43:57 compute-0 nova_compute[192567]: 2025-10-02 08:43:57.383 2 INFO nova.scheduler.client.report [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Deleted allocations for instance fbaabbd4-48b4-4f5e-a7e6-e71da0917c01
Oct 02 08:43:57 compute-0 nova_compute[192567]: 2025-10-02 08:43:57.490 2 DEBUG oslo_concurrency.lockutils [None req-1cfc5d25-d26d-4068-8a03-081ebc5dd1fb ab2d5dc08c96417b93ba3fc03cddf0cd cfed6615d64e404ab1542b20621438d9 - - default default] Lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:43:58 compute-0 nova_compute[192567]: 2025-10-02 08:43:58.418 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:43:58 compute-0 nova_compute[192567]: 2025-10-02 08:43:58.418 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:43:58 compute-0 nova_compute[192567]: 2025-10-02 08:43:58.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:43:59 compute-0 nova_compute[192567]: 2025-10-02 08:43:59.042 2 DEBUG nova.compute.manager [req-9b38a9ff-3bab-44a1-9d97-2b93d138d8bd req-d4dedc2b-e6b8-4868-937c-0704515df5af 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Received event network-vif-plugged-01051fd6-1d94-4199-bdca-cc108fc67855 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:43:59 compute-0 nova_compute[192567]: 2025-10-02 08:43:59.043 2 DEBUG oslo_concurrency.lockutils [req-9b38a9ff-3bab-44a1-9d97-2b93d138d8bd req-d4dedc2b-e6b8-4868-937c-0704515df5af 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:43:59 compute-0 nova_compute[192567]: 2025-10-02 08:43:59.043 2 DEBUG oslo_concurrency.lockutils [req-9b38a9ff-3bab-44a1-9d97-2b93d138d8bd req-d4dedc2b-e6b8-4868-937c-0704515df5af 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:43:59 compute-0 nova_compute[192567]: 2025-10-02 08:43:59.044 2 DEBUG oslo_concurrency.lockutils [req-9b38a9ff-3bab-44a1-9d97-2b93d138d8bd req-d4dedc2b-e6b8-4868-937c-0704515df5af 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "fbaabbd4-48b4-4f5e-a7e6-e71da0917c01-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:43:59 compute-0 nova_compute[192567]: 2025-10-02 08:43:59.044 2 DEBUG nova.compute.manager [req-9b38a9ff-3bab-44a1-9d97-2b93d138d8bd req-d4dedc2b-e6b8-4868-937c-0704515df5af 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] No waiting events found dispatching network-vif-plugged-01051fd6-1d94-4199-bdca-cc108fc67855 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:43:59 compute-0 nova_compute[192567]: 2025-10-02 08:43:59.045 2 WARNING nova.compute.manager [req-9b38a9ff-3bab-44a1-9d97-2b93d138d8bd req-d4dedc2b-e6b8-4868-937c-0704515df5af 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Received unexpected event network-vif-plugged-01051fd6-1d94-4199-bdca-cc108fc67855 for instance with vm_state deleted and task_state None.
Oct 02 08:43:59 compute-0 nova_compute[192567]: 2025-10-02 08:43:59.045 2 DEBUG nova.compute.manager [req-9b38a9ff-3bab-44a1-9d97-2b93d138d8bd req-d4dedc2b-e6b8-4868-937c-0704515df5af 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Received event network-vif-deleted-01051fd6-1d94-4199-bdca-cc108fc67855 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:43:59 compute-0 nova_compute[192567]: 2025-10-02 08:43:59.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:43:59 compute-0 nova_compute[192567]: 2025-10-02 08:43:59.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:43:59 compute-0 podman[203011]: time="2025-10-02T08:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:43:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:43:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3009 "" "Go-http-client/1.1"
Oct 02 08:44:01 compute-0 openstack_network_exporter[205118]: ERROR   08:44:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:44:01 compute-0 openstack_network_exporter[205118]: ERROR   08:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:44:01 compute-0 openstack_network_exporter[205118]: ERROR   08:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:44:01 compute-0 openstack_network_exporter[205118]: ERROR   08:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:44:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:44:01 compute-0 openstack_network_exporter[205118]: ERROR   08:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:44:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:44:01 compute-0 nova_compute[192567]: 2025-10-02 08:44:01.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:04 compute-0 nova_compute[192567]: 2025-10-02 08:44:04.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:04 compute-0 nova_compute[192567]: 2025-10-02 08:44:04.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:44:04 compute-0 nova_compute[192567]: 2025-10-02 08:44:04.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:44:06 compute-0 nova_compute[192567]: 2025-10-02 08:44:06.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:08 compute-0 nova_compute[192567]: 2025-10-02 08:44:08.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:44:09 compute-0 nova_compute[192567]: 2025-10-02 08:44:09.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:11 compute-0 nova_compute[192567]: 2025-10-02 08:44:11.557 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759394636.5552375, fbaabbd4-48b4-4f5e-a7e6-e71da0917c01 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:44:11 compute-0 nova_compute[192567]: 2025-10-02 08:44:11.558 2 INFO nova.compute.manager [-] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] VM Stopped (Lifecycle Event)
Oct 02 08:44:11 compute-0 nova_compute[192567]: 2025-10-02 08:44:11.597 2 DEBUG nova.compute.manager [None req-10b68a04-4926-4465-80ff-84d1e21a2f48 - - - - - -] [instance: fbaabbd4-48b4-4f5e-a7e6-e71da0917c01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:44:11 compute-0 nova_compute[192567]: 2025-10-02 08:44:11.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:12 compute-0 podman[228446]: 2025-10-02 08:44:12.190498816 +0000 UTC m=+0.096718482 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 02 08:44:12 compute-0 podman[228448]: 2025-10-02 08:44:12.242586608 +0000 UTC m=+0.131246997 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd)
Oct 02 08:44:12 compute-0 podman[228449]: 2025-10-02 08:44:12.242665941 +0000 UTC m=+0.129581156 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 02 08:44:12 compute-0 podman[228447]: 2025-10-02 08:44:12.272442937 +0000 UTC m=+0.167085483 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 02 08:44:14 compute-0 nova_compute[192567]: 2025-10-02 08:44:14.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:16 compute-0 nova_compute[192567]: 2025-10-02 08:44:16.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:17 compute-0 podman[228529]: 2025-10-02 08:44:17.183792743 +0000 UTC m=+0.087686271 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:44:19 compute-0 nova_compute[192567]: 2025-10-02 08:44:19.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:21 compute-0 nova_compute[192567]: 2025-10-02 08:44:21.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:24 compute-0 nova_compute[192567]: 2025-10-02 08:44:24.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:26 compute-0 nova_compute[192567]: 2025-10-02 08:44:26.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:27 compute-0 podman[228553]: 2025-10-02 08:44:27.20544616 +0000 UTC m=+0.101469150 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, config_id=edpm)
Oct 02 08:44:29 compute-0 nova_compute[192567]: 2025-10-02 08:44:29.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:29 compute-0 nova_compute[192567]: 2025-10-02 08:44:29.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:29 compute-0 podman[203011]: time="2025-10-02T08:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:44:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:44:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 02 08:44:31 compute-0 openstack_network_exporter[205118]: ERROR   08:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:44:31 compute-0 openstack_network_exporter[205118]: ERROR   08:44:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:44:31 compute-0 openstack_network_exporter[205118]: ERROR   08:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:44:31 compute-0 openstack_network_exporter[205118]: ERROR   08:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:44:31 compute-0 openstack_network_exporter[205118]: ERROR   08:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:44:31 compute-0 nova_compute[192567]: 2025-10-02 08:44:31.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:34 compute-0 nova_compute[192567]: 2025-10-02 08:44:34.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:36 compute-0 nova_compute[192567]: 2025-10-02 08:44:36.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:39 compute-0 nova_compute[192567]: 2025-10-02 08:44:39.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:41 compute-0 nova_compute[192567]: 2025-10-02 08:44:41.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:43 compute-0 podman[228575]: 2025-10-02 08:44:43.18966154 +0000 UTC m=+0.092446009 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Oct 02 08:44:43 compute-0 podman[228577]: 2025-10-02 08:44:43.221541932 +0000 UTC m=+0.116336743 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:44:43 compute-0 podman[228578]: 2025-10-02 08:44:43.254409206 +0000 UTC m=+0.113023809 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 02 08:44:43 compute-0 podman[228576]: 2025-10-02 08:44:43.262609551 +0000 UTC m=+0.156791132 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:44:44 compute-0 nova_compute[192567]: 2025-10-02 08:44:44.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:44:46.010 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:44:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:44:46.010 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:44:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:44:46.011 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:44:46 compute-0 nova_compute[192567]: 2025-10-02 08:44:46.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:44:46.934 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:44:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:44:46.935 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:44:46 compute-0 nova_compute[192567]: 2025-10-02 08:44:46.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:48 compute-0 podman[228661]: 2025-10-02 08:44:48.15608481 +0000 UTC m=+0.071483616 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:44:49 compute-0 nova_compute[192567]: 2025-10-02 08:44:49.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:51 compute-0 nova_compute[192567]: 2025-10-02 08:44:51.622 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:44:51 compute-0 nova_compute[192567]: 2025-10-02 08:44:51.622 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:44:51 compute-0 nova_compute[192567]: 2025-10-02 08:44:51.622 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:44:51 compute-0 nova_compute[192567]: 2025-10-02 08:44:51.623 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:44:51 compute-0 nova_compute[192567]: 2025-10-02 08:44:51.637 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:44:51 compute-0 nova_compute[192567]: 2025-10-02 08:44:51.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:54 compute-0 nova_compute[192567]: 2025-10-02 08:44:54.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:55 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:44:55.940 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.654 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.655 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.655 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.655 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.847 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.848 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5870MB free_disk=73.46175765991211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.848 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.849 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.921 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.921 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.960 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:44:56 compute-0 nova_compute[192567]: 2025-10-02 08:44:56.985 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:44:57 compute-0 nova_compute[192567]: 2025-10-02 08:44:57.020 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:44:57 compute-0 nova_compute[192567]: 2025-10-02 08:44:57.020 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:44:58 compute-0 nova_compute[192567]: 2025-10-02 08:44:58.023 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:44:58 compute-0 nova_compute[192567]: 2025-10-02 08:44:58.024 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:44:58 compute-0 podman[228689]: 2025-10-02 08:44:58.146899367 +0000 UTC m=+0.087063292 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Oct 02 08:44:59 compute-0 nova_compute[192567]: 2025-10-02 08:44:59.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:44:59 compute-0 podman[203011]: time="2025-10-02T08:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:44:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:44:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Oct 02 08:45:00 compute-0 nova_compute[192567]: 2025-10-02 08:45:00.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:45:01 compute-0 openstack_network_exporter[205118]: ERROR   08:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:45:01 compute-0 openstack_network_exporter[205118]: ERROR   08:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:45:01 compute-0 openstack_network_exporter[205118]: ERROR   08:45:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:45:01 compute-0 openstack_network_exporter[205118]: ERROR   08:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:45:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:45:01 compute-0 openstack_network_exporter[205118]: ERROR   08:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:45:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:45:01 compute-0 nova_compute[192567]: 2025-10-02 08:45:01.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:04 compute-0 ovn_controller[94821]: 2025-10-02T08:45:04Z|00253|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct 02 08:45:04 compute-0 nova_compute[192567]: 2025-10-02 08:45:04.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:05 compute-0 unix_chkpwd[228712]: password check failed for user (root)
Oct 02 08:45:05 compute-0 sshd-session[228710]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=106.36.198.78  user=root
Oct 02 08:45:05 compute-0 nova_compute[192567]: 2025-10-02 08:45:05.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:45:05 compute-0 nova_compute[192567]: 2025-10-02 08:45:05.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:45:06 compute-0 nova_compute[192567]: 2025-10-02 08:45:06.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:07 compute-0 sshd-session[228710]: Failed password for root from 106.36.198.78 port 48438 ssh2
Oct 02 08:45:09 compute-0 sshd-session[228710]: Received disconnect from 106.36.198.78 port 48438:11:  [preauth]
Oct 02 08:45:09 compute-0 sshd-session[228710]: Disconnected from authenticating user root 106.36.198.78 port 48438 [preauth]
Oct 02 08:45:09 compute-0 nova_compute[192567]: 2025-10-02 08:45:09.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:09 compute-0 nova_compute[192567]: 2025-10-02 08:45:09.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:45:11 compute-0 nova_compute[192567]: 2025-10-02 08:45:11.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:14 compute-0 podman[228715]: 2025-10-02 08:45:14.209056984 +0000 UTC m=+0.096971171 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 02 08:45:14 compute-0 podman[228713]: 2025-10-02 08:45:14.230942605 +0000 UTC m=+0.132097844 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:45:14 compute-0 podman[228716]: 2025-10-02 08:45:14.231258365 +0000 UTC m=+0.119796341 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:45:14 compute-0 podman[228714]: 2025-10-02 08:45:14.267325417 +0000 UTC m=+0.161675714 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller)
Oct 02 08:45:14 compute-0 nova_compute[192567]: 2025-10-02 08:45:14.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:16 compute-0 nova_compute[192567]: 2025-10-02 08:45:16.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:19 compute-0 podman[228792]: 2025-10-02 08:45:19.18435476 +0000 UTC m=+0.094151733 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:45:19 compute-0 nova_compute[192567]: 2025-10-02 08:45:19.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:21 compute-0 nova_compute[192567]: 2025-10-02 08:45:21.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:24 compute-0 nova_compute[192567]: 2025-10-02 08:45:24.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:26 compute-0 nova_compute[192567]: 2025-10-02 08:45:26.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:29 compute-0 podman[228817]: 2025-10-02 08:45:29.179937184 +0000 UTC m=+0.090025473 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Oct 02 08:45:29 compute-0 nova_compute[192567]: 2025-10-02 08:45:29.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:29 compute-0 podman[203011]: time="2025-10-02T08:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:45:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:45:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 02 08:45:31 compute-0 openstack_network_exporter[205118]: ERROR   08:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:45:31 compute-0 openstack_network_exporter[205118]: ERROR   08:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:45:31 compute-0 openstack_network_exporter[205118]: ERROR   08:45:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:45:31 compute-0 openstack_network_exporter[205118]: ERROR   08:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:45:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:45:31 compute-0 openstack_network_exporter[205118]: ERROR   08:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:45:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:45:31 compute-0 nova_compute[192567]: 2025-10-02 08:45:31.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:34 compute-0 nova_compute[192567]: 2025-10-02 08:45:34.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:36 compute-0 nova_compute[192567]: 2025-10-02 08:45:36.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:39 compute-0 nova_compute[192567]: 2025-10-02 08:45:39.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:41 compute-0 nova_compute[192567]: 2025-10-02 08:45:41.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:44 compute-0 nova_compute[192567]: 2025-10-02 08:45:44.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:44 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 08:45:44 compute-0 podman[228838]: 2025-10-02 08:45:44.975075945 +0000 UTC m=+0.085226694 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:45:45 compute-0 podman[228841]: 2025-10-02 08:45:45.000209968 +0000 UTC m=+0.093898255 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, container_name=iscsid)
Oct 02 08:45:45 compute-0 podman[228840]: 2025-10-02 08:45:45.001555959 +0000 UTC m=+0.102754550 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd)
Oct 02 08:45:45 compute-0 podman[228839]: 2025-10-02 08:45:45.043772454 +0000 UTC m=+0.149891837 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:45:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:45:46.010 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:45:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:45:46.011 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:45:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:45:46.012 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:45:46 compute-0 nova_compute[192567]: 2025-10-02 08:45:46.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:49 compute-0 nova_compute[192567]: 2025-10-02 08:45:49.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:50 compute-0 podman[228917]: 2025-10-02 08:45:50.17803976 +0000 UTC m=+0.085741791 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:45:51 compute-0 nova_compute[192567]: 2025-10-02 08:45:51.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:45:51 compute-0 nova_compute[192567]: 2025-10-02 08:45:51.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:52 compute-0 nova_compute[192567]: 2025-10-02 08:45:52.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:45:52 compute-0 nova_compute[192567]: 2025-10-02 08:45:52.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:45:52 compute-0 nova_compute[192567]: 2025-10-02 08:45:52.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:45:52 compute-0 nova_compute[192567]: 2025-10-02 08:45:52.639 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:45:54 compute-0 nova_compute[192567]: 2025-10-02 08:45:54.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:56 compute-0 nova_compute[192567]: 2025-10-02 08:45:56.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.635 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.638 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.639 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.639 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.679 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.680 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.681 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.681 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.918 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.920 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5876MB free_disk=73.46177291870117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.921 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:45:58 compute-0 nova_compute[192567]: 2025-10-02 08:45:58.921 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.007 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.008 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.028 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.052 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.053 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.065 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.091 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.113 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.126 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.128 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.128 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:45:59 compute-0 nova_compute[192567]: 2025-10-02 08:45:59.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:45:59 compute-0 podman[203011]: time="2025-10-02T08:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:45:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:45:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 02 08:46:00 compute-0 podman[228941]: 2025-10-02 08:46:00.164992185 +0000 UTC m=+0.079855047 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 02 08:46:01 compute-0 openstack_network_exporter[205118]: ERROR   08:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:46:01 compute-0 openstack_network_exporter[205118]: ERROR   08:46:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:46:01 compute-0 openstack_network_exporter[205118]: ERROR   08:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:46:01 compute-0 openstack_network_exporter[205118]: ERROR   08:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:46:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:46:01 compute-0 openstack_network_exporter[205118]: ERROR   08:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:46:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:46:01 compute-0 nova_compute[192567]: 2025-10-02 08:46:01.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:03 compute-0 nova_compute[192567]: 2025-10-02 08:46:03.114 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:04 compute-0 nova_compute[192567]: 2025-10-02 08:46:04.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:04 compute-0 nova_compute[192567]: 2025-10-02 08:46:04.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:06 compute-0 nova_compute[192567]: 2025-10-02 08:46:06.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:07 compute-0 nova_compute[192567]: 2025-10-02 08:46:07.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:07 compute-0 nova_compute[192567]: 2025-10-02 08:46:07.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:46:09 compute-0 nova_compute[192567]: 2025-10-02 08:46:09.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:09 compute-0 nova_compute[192567]: 2025-10-02 08:46:09.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:11 compute-0 nova_compute[192567]: 2025-10-02 08:46:11.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:14 compute-0 nova_compute[192567]: 2025-10-02 08:46:14.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:15 compute-0 podman[228962]: 2025-10-02 08:46:15.197370981 +0000 UTC m=+0.099319104 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 02 08:46:15 compute-0 podman[228965]: 2025-10-02 08:46:15.221734589 +0000 UTC m=+0.110026146 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2)
Oct 02 08:46:15 compute-0 podman[228963]: 2025-10-02 08:46:15.244813678 +0000 UTC m=+0.145041647 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 02 08:46:15 compute-0 podman[228964]: 2025-10-02 08:46:15.249274276 +0000 UTC m=+0.143464307 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 02 08:46:16 compute-0 nova_compute[192567]: 2025-10-02 08:46:16.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:19 compute-0 nova_compute[192567]: 2025-10-02 08:46:19.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:21 compute-0 podman[229042]: 2025-10-02 08:46:21.184185658 +0000 UTC m=+0.085769971 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:46:21 compute-0 nova_compute[192567]: 2025-10-02 08:46:21.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:24 compute-0 nova_compute[192567]: 2025-10-02 08:46:24.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:26 compute-0 nova_compute[192567]: 2025-10-02 08:46:26.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:46:29.249 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:46:29 compute-0 nova_compute[192567]: 2025-10-02 08:46:29.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:46:29.250 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:46:29 compute-0 nova_compute[192567]: 2025-10-02 08:46:29.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:29 compute-0 podman[203011]: time="2025-10-02T08:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:46:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:46:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Oct 02 08:46:31 compute-0 podman[229066]: 2025-10-02 08:46:31.202363477 +0000 UTC m=+0.114733763 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Oct 02 08:46:31 compute-0 openstack_network_exporter[205118]: ERROR   08:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:46:31 compute-0 openstack_network_exporter[205118]: ERROR   08:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:46:31 compute-0 openstack_network_exporter[205118]: ERROR   08:46:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:46:31 compute-0 openstack_network_exporter[205118]: ERROR   08:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:46:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:46:31 compute-0 openstack_network_exporter[205118]: ERROR   08:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:46:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:46:31 compute-0 nova_compute[192567]: 2025-10-02 08:46:31.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:32 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:46:32.253 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:46:34 compute-0 nova_compute[192567]: 2025-10-02 08:46:34.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:36 compute-0 nova_compute[192567]: 2025-10-02 08:46:36.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:39 compute-0 nova_compute[192567]: 2025-10-02 08:46:39.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:41 compute-0 nova_compute[192567]: 2025-10-02 08:46:41.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:44 compute-0 nova_compute[192567]: 2025-10-02 08:46:44.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:46:46.011 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:46:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:46:46.012 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:46:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:46:46.012 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:46:46 compute-0 podman[229087]: 2025-10-02 08:46:46.167651772 +0000 UTC m=+0.073542220 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct 02 08:46:46 compute-0 podman[229095]: 2025-10-02 08:46:46.211010611 +0000 UTC m=+0.086084351 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=iscsid, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 02 08:46:46 compute-0 podman[229089]: 2025-10-02 08:46:46.215508722 +0000 UTC m=+0.103627077 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct 02 08:46:46 compute-0 podman[229088]: 2025-10-02 08:46:46.276398467 +0000 UTC m=+0.172964515 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:46:46 compute-0 nova_compute[192567]: 2025-10-02 08:46:46.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:49 compute-0 nova_compute[192567]: 2025-10-02 08:46:49.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:51 compute-0 nova_compute[192567]: 2025-10-02 08:46:51.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:52 compute-0 podman[229170]: 2025-10-02 08:46:52.160647254 +0000 UTC m=+0.074943524 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:46:52 compute-0 nova_compute[192567]: 2025-10-02 08:46:52.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:52 compute-0 nova_compute[192567]: 2025-10-02 08:46:52.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:46:52 compute-0 nova_compute[192567]: 2025-10-02 08:46:52.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:46:52 compute-0 nova_compute[192567]: 2025-10-02 08:46:52.673 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:46:53 compute-0 nova_compute[192567]: 2025-10-02 08:46:53.669 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:54 compute-0 nova_compute[192567]: 2025-10-02 08:46:54.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:56 compute-0 nova_compute[192567]: 2025-10-02 08:46:56.628 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:56 compute-0 nova_compute[192567]: 2025-10-02 08:46:56.629 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:46:56 compute-0 nova_compute[192567]: 2025-10-02 08:46:56.654 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:46:56 compute-0 nova_compute[192567]: 2025-10-02 08:46:56.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:58 compute-0 nova_compute[192567]: 2025-10-02 08:46:58.653 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:59 compute-0 nova_compute[192567]: 2025-10-02 08:46:59.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:46:59 compute-0 nova_compute[192567]: 2025-10-02 08:46:59.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:59 compute-0 nova_compute[192567]: 2025-10-02 08:46:59.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:59 compute-0 nova_compute[192567]: 2025-10-02 08:46:59.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:46:59 compute-0 nova_compute[192567]: 2025-10-02 08:46:59.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:46:59 compute-0 podman[203011]: time="2025-10-02T08:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:46:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:46:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.648 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.679 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.680 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.680 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.680 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.868 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.870 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5882MB free_disk=73.46175384521484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.870 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.870 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.944 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.945 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.967 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.987 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.988 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:47:00 compute-0 nova_compute[192567]: 2025-10-02 08:47:00.988 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:47:01 compute-0 openstack_network_exporter[205118]: ERROR   08:47:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:47:01 compute-0 openstack_network_exporter[205118]: ERROR   08:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:47:01 compute-0 openstack_network_exporter[205118]: ERROR   08:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:47:01 compute-0 openstack_network_exporter[205118]: ERROR   08:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:47:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:47:01 compute-0 openstack_network_exporter[205118]: ERROR   08:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:47:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:47:01 compute-0 anacron[190925]: Job `cron.weekly' started
Oct 02 08:47:01 compute-0 anacron[190925]: Job `cron.weekly' terminated
Oct 02 08:47:01 compute-0 nova_compute[192567]: 2025-10-02 08:47:01.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:02 compute-0 podman[229195]: 2025-10-02 08:47:02.153643378 +0000 UTC m=+0.066889404 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, release=1755695350)
Oct 02 08:47:03 compute-0 unix_chkpwd[229218]: password check failed for user (root)
Oct 02 08:47:03 compute-0 sshd-session[229216]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Oct 02 08:47:03 compute-0 nova_compute[192567]: 2025-10-02 08:47:03.966 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:47:04 compute-0 nova_compute[192567]: 2025-10-02 08:47:04.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:05 compute-0 sshd-session[229216]: Failed password for root from 91.224.92.108 port 41596 ssh2
Oct 02 08:47:06 compute-0 nova_compute[192567]: 2025-10-02 08:47:06.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:07 compute-0 nova_compute[192567]: 2025-10-02 08:47:07.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:47:07 compute-0 nova_compute[192567]: 2025-10-02 08:47:07.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:47:07 compute-0 unix_chkpwd[229219]: password check failed for user (root)
Oct 02 08:47:09 compute-0 nova_compute[192567]: 2025-10-02 08:47:09.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:09 compute-0 sshd-session[229216]: Failed password for root from 91.224.92.108 port 41596 ssh2
Oct 02 08:47:09 compute-0 nova_compute[192567]: 2025-10-02 08:47:09.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:47:10 compute-0 unix_chkpwd[229220]: password check failed for user (root)
Oct 02 08:47:11 compute-0 nova_compute[192567]: 2025-10-02 08:47:11.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:12 compute-0 sshd-session[229216]: Failed password for root from 91.224.92.108 port 41596 ssh2
Oct 02 08:47:14 compute-0 sshd-session[229216]: Received disconnect from 91.224.92.108 port 41596:11:  [preauth]
Oct 02 08:47:14 compute-0 sshd-session[229216]: Disconnected from authenticating user root 91.224.92.108 port 41596 [preauth]
Oct 02 08:47:14 compute-0 sshd-session[229216]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Oct 02 08:47:14 compute-0 nova_compute[192567]: 2025-10-02 08:47:14.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:16 compute-0 nova_compute[192567]: 2025-10-02 08:47:16.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:17 compute-0 podman[229223]: 2025-10-02 08:47:17.215535641 +0000 UTC m=+0.120333167 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 08:47:17 compute-0 podman[229226]: 2025-10-02 08:47:17.221318982 +0000 UTC m=+0.102923546 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 02 08:47:17 compute-0 podman[229225]: 2025-10-02 08:47:17.230318182 +0000 UTC m=+0.113847036 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:47:17 compute-0 podman[229224]: 2025-10-02 08:47:17.2569456 +0000 UTC m=+0.150487706 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible)
Oct 02 08:47:18 compute-0 unix_chkpwd[229304]: password check failed for user (root)
Oct 02 08:47:18 compute-0 sshd-session[229221]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Oct 02 08:47:19 compute-0 nova_compute[192567]: 2025-10-02 08:47:19.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:19 compute-0 sshd-session[229221]: Failed password for root from 91.224.92.108 port 31874 ssh2
Oct 02 08:47:20 compute-0 unix_chkpwd[229305]: password check failed for user (root)
Oct 02 08:47:21 compute-0 sshd-session[229221]: Failed password for root from 91.224.92.108 port 31874 ssh2
Oct 02 08:47:21 compute-0 nova_compute[192567]: 2025-10-02 08:47:21.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:22 compute-0 unix_chkpwd[229306]: password check failed for user (root)
Oct 02 08:47:22 compute-0 nova_compute[192567]: 2025-10-02 08:47:22.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:47:23 compute-0 podman[229307]: 2025-10-02 08:47:23.187747255 +0000 UTC m=+0.088637410 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:47:24 compute-0 nova_compute[192567]: 2025-10-02 08:47:24.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:25 compute-0 sshd-session[229221]: Failed password for root from 91.224.92.108 port 31874 ssh2
Oct 02 08:47:26 compute-0 nova_compute[192567]: 2025-10-02 08:47:26.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:27 compute-0 sshd-session[229221]: Received disconnect from 91.224.92.108 port 31874:11:  [preauth]
Oct 02 08:47:27 compute-0 sshd-session[229221]: Disconnected from authenticating user root 91.224.92.108 port 31874 [preauth]
Oct 02 08:47:27 compute-0 sshd-session[229221]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Oct 02 08:47:28 compute-0 unix_chkpwd[229333]: password check failed for user (root)
Oct 02 08:47:28 compute-0 sshd-session[229331]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Oct 02 08:47:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:47:29.243 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:47:29 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:47:29.245 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:47:29 compute-0 nova_compute[192567]: 2025-10-02 08:47:29.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:29 compute-0 nova_compute[192567]: 2025-10-02 08:47:29.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:29 compute-0 podman[203011]: time="2025-10-02T08:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:47:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:47:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3005 "" "Go-http-client/1.1"
Oct 02 08:47:29 compute-0 sshd-session[229331]: Failed password for root from 91.224.92.108 port 46208 ssh2
Oct 02 08:47:30 compute-0 unix_chkpwd[229334]: password check failed for user (root)
Oct 02 08:47:31 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:47:31.247 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:47:31 compute-0 openstack_network_exporter[205118]: ERROR   08:47:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:47:31 compute-0 openstack_network_exporter[205118]: ERROR   08:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:47:31 compute-0 openstack_network_exporter[205118]: ERROR   08:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:47:31 compute-0 openstack_network_exporter[205118]: ERROR   08:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:47:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:47:31 compute-0 openstack_network_exporter[205118]: ERROR   08:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:47:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:47:31 compute-0 nova_compute[192567]: 2025-10-02 08:47:31.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:32 compute-0 sshd-session[229331]: Failed password for root from 91.224.92.108 port 46208 ssh2
Oct 02 08:47:32 compute-0 unix_chkpwd[229335]: password check failed for user (root)
Oct 02 08:47:33 compute-0 podman[229336]: 2025-10-02 08:47:33.206373627 +0000 UTC m=+0.106341971 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 02 08:47:33 compute-0 sshd-session[229331]: Failed password for root from 91.224.92.108 port 46208 ssh2
Oct 02 08:47:34 compute-0 nova_compute[192567]: 2025-10-02 08:47:34.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:34 compute-0 sshd-session[229331]: Received disconnect from 91.224.92.108 port 46208:11:  [preauth]
Oct 02 08:47:34 compute-0 sshd-session[229331]: Disconnected from authenticating user root 91.224.92.108 port 46208 [preauth]
Oct 02 08:47:34 compute-0 sshd-session[229331]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Oct 02 08:47:36 compute-0 nova_compute[192567]: 2025-10-02 08:47:36.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:39 compute-0 nova_compute[192567]: 2025-10-02 08:47:39.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:41 compute-0 nova_compute[192567]: 2025-10-02 08:47:41.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:44 compute-0 nova_compute[192567]: 2025-10-02 08:47:44.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:45 compute-0 sshd-session[229358]: error: kex_exchange_identification: read: Connection reset by peer
Oct 02 08:47:45 compute-0 sshd-session[229358]: Connection reset by 45.140.17.97 port 15184
Oct 02 08:47:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:47:46.012 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:47:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:47:46.013 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:47:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:47:46.014 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:47:46 compute-0 nova_compute[192567]: 2025-10-02 08:47:46.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:48 compute-0 podman[229362]: 2025-10-02 08:47:48.186783475 +0000 UTC m=+0.070219077 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 02 08:47:48 compute-0 podman[229361]: 2025-10-02 08:47:48.192448032 +0000 UTC m=+0.079948490 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 08:47:48 compute-0 podman[229359]: 2025-10-02 08:47:48.198314165 +0000 UTC m=+0.094087311 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:47:48 compute-0 podman[229360]: 2025-10-02 08:47:48.22517288 +0000 UTC m=+0.120223463 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:47:49 compute-0 nova_compute[192567]: 2025-10-02 08:47:49.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:51 compute-0 nova_compute[192567]: 2025-10-02 08:47:51.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:52 compute-0 nova_compute[192567]: 2025-10-02 08:47:52.639 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:47:52 compute-0 nova_compute[192567]: 2025-10-02 08:47:52.639 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:47:52 compute-0 nova_compute[192567]: 2025-10-02 08:47:52.640 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:47:52 compute-0 nova_compute[192567]: 2025-10-02 08:47:52.668 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:47:54 compute-0 podman[229440]: 2025-10-02 08:47:54.149869806 +0000 UTC m=+0.065730558 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:47:54 compute-0 nova_compute[192567]: 2025-10-02 08:47:54.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:55 compute-0 nova_compute[192567]: 2025-10-02 08:47:55.649 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:47:56 compute-0 nova_compute[192567]: 2025-10-02 08:47:56.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:59 compute-0 nova_compute[192567]: 2025-10-02 08:47:59.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:47:59 compute-0 nova_compute[192567]: 2025-10-02 08:47:59.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:47:59 compute-0 podman[203011]: time="2025-10-02T08:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:47:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:47:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3007 "" "Go-http-client/1.1"
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.655 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.656 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.656 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.657 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.868 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.870 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5887MB free_disk=73.46176528930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.870 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:00 compute-0 nova_compute[192567]: 2025-10-02 08:48:00.871 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:01 compute-0 nova_compute[192567]: 2025-10-02 08:48:01.024 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:48:01 compute-0 nova_compute[192567]: 2025-10-02 08:48:01.025 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:48:01 compute-0 nova_compute[192567]: 2025-10-02 08:48:01.057 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:48:01 compute-0 nova_compute[192567]: 2025-10-02 08:48:01.086 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:48:01 compute-0 nova_compute[192567]: 2025-10-02 08:48:01.090 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:48:01 compute-0 nova_compute[192567]: 2025-10-02 08:48:01.090 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:01 compute-0 openstack_network_exporter[205118]: ERROR   08:48:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:48:01 compute-0 openstack_network_exporter[205118]: ERROR   08:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:48:01 compute-0 openstack_network_exporter[205118]: ERROR   08:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:48:01 compute-0 openstack_network_exporter[205118]: ERROR   08:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:48:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:48:01 compute-0 openstack_network_exporter[205118]: ERROR   08:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:48:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:48:01 compute-0 nova_compute[192567]: 2025-10-02 08:48:01.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:04 compute-0 podman[229464]: 2025-10-02 08:48:04.185868418 +0000 UTC m=+0.092493811 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, config_id=edpm, distribution-scope=public, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Oct 02 08:48:04 compute-0 nova_compute[192567]: 2025-10-02 08:48:04.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:05 compute-0 nova_compute[192567]: 2025-10-02 08:48:05.091 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:48:06 compute-0 nova_compute[192567]: 2025-10-02 08:48:06.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:07 compute-0 nova_compute[192567]: 2025-10-02 08:48:07.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:48:07 compute-0 nova_compute[192567]: 2025-10-02 08:48:07.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:48:08 compute-0 nova_compute[192567]: 2025-10-02 08:48:08.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:48:09 compute-0 nova_compute[192567]: 2025-10-02 08:48:09.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:10 compute-0 nova_compute[192567]: 2025-10-02 08:48:10.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:48:11 compute-0 nova_compute[192567]: 2025-10-02 08:48:11.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:11 compute-0 nova_compute[192567]: 2025-10-02 08:48:11.946 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "ad53b0ad-a45d-429f-9f25-0da979224b83" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:11 compute-0 nova_compute[192567]: 2025-10-02 08:48:11.947 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:11 compute-0 nova_compute[192567]: 2025-10-02 08:48:11.963 2 DEBUG nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.065 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.066 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.076 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.077 2 INFO nova.compute.claims [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Claim successful on node compute-0.ctlplane.example.com
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.207 2 DEBUG nova.compute.provider_tree [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.232 2 DEBUG nova.scheduler.client.report [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.284 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.285 2 DEBUG nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.341 2 DEBUG nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.342 2 DEBUG nova.network.neutron [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.373 2 INFO nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.398 2 DEBUG nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.503 2 DEBUG nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.505 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.505 2 INFO nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Creating image(s)
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.506 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "/var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.507 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "/var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.508 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "/var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.533 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.632 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.633 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.634 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.645 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.737 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.739 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.776 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.778 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.778 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.846 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.848 2 DEBUG nova.virt.disk.api [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Checking if we can resize image /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.849 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.910 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.911 2 DEBUG nova.virt.disk.api [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Cannot resize image /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.912 2 DEBUG nova.objects.instance [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lazy-loading 'migration_context' on Instance uuid ad53b0ad-a45d-429f-9f25-0da979224b83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.931 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.932 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Ensure instance console log exists: /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.933 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.933 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:12 compute-0 nova_compute[192567]: 2025-10-02 08:48:12.934 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:13 compute-0 nova_compute[192567]: 2025-10-02 08:48:13.235 2 DEBUG nova.network.neutron [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Successfully created port: 6b981388-21bd-4946-8422-9a7fcd19cd4a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 02 08:48:14 compute-0 nova_compute[192567]: 2025-10-02 08:48:14.238 2 DEBUG nova.network.neutron [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Successfully updated port: 6b981388-21bd-4946-8422-9a7fcd19cd4a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 02 08:48:14 compute-0 nova_compute[192567]: 2025-10-02 08:48:14.253 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:48:14 compute-0 nova_compute[192567]: 2025-10-02 08:48:14.254 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquired lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:48:14 compute-0 nova_compute[192567]: 2025-10-02 08:48:14.254 2 DEBUG nova.network.neutron [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:48:14 compute-0 nova_compute[192567]: 2025-10-02 08:48:14.333 2 DEBUG nova.compute.manager [req-efc41ec6-aaaa-47e1-b6b1-04b3ee858369 req-fbd4af0b-cc73-416d-9cbd-3175190c9419 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Received event network-changed-6b981388-21bd-4946-8422-9a7fcd19cd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:48:14 compute-0 nova_compute[192567]: 2025-10-02 08:48:14.334 2 DEBUG nova.compute.manager [req-efc41ec6-aaaa-47e1-b6b1-04b3ee858369 req-fbd4af0b-cc73-416d-9cbd-3175190c9419 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Refreshing instance network info cache due to event network-changed-6b981388-21bd-4946-8422-9a7fcd19cd4a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 02 08:48:14 compute-0 nova_compute[192567]: 2025-10-02 08:48:14.334 2 DEBUG oslo_concurrency.lockutils [req-efc41ec6-aaaa-47e1-b6b1-04b3ee858369 req-fbd4af0b-cc73-416d-9cbd-3175190c9419 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:48:14 compute-0 nova_compute[192567]: 2025-10-02 08:48:14.412 2 DEBUG nova.network.neutron [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 02 08:48:14 compute-0 nova_compute[192567]: 2025-10-02 08:48:14.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.088 2 DEBUG nova.network.neutron [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Updating instance_info_cache with network_info: [{"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.107 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Releasing lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.108 2 DEBUG nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Instance network_info: |[{"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.108 2 DEBUG oslo_concurrency.lockutils [req-efc41ec6-aaaa-47e1-b6b1-04b3ee858369 req-fbd4af0b-cc73-416d-9cbd-3175190c9419 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.109 2 DEBUG nova.network.neutron [req-efc41ec6-aaaa-47e1-b6b1-04b3ee858369 req-fbd4af0b-cc73-416d-9cbd-3175190c9419 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Refreshing network info cache for port 6b981388-21bd-4946-8422-9a7fcd19cd4a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.114 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Start _get_guest_xml network_info=[{"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'guest_format': None, 'disk_bus': 'virtio', 'image_id': 'f5cf0efc-6f3c-4865-b002-490e9c9b250d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.122 2 WARNING nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.132 2 DEBUG nova.virt.libvirt.host [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.133 2 DEBUG nova.virt.libvirt.host [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.138 2 DEBUG nova.virt.libvirt.host [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.139 2 DEBUG nova.virt.libvirt.host [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.140 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.141 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-02T08:06:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='932d352e-81e8-4137-94d3-19616d5c2ae2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-02T08:06:23Z,direct_url=<?>,disk_format='qcow2',id=f5cf0efc-6f3c-4865-b002-490e9c9b250d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a46cbd7217a541c58391886cae342f44',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-02T08:06:24Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.141 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.142 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.142 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.142 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.143 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.143 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.144 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.144 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.144 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.145 2 DEBUG nova.virt.hardware [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.153 2 DEBUG nova.virt.libvirt.vif [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:48:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-319216127',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-319216127',id=38,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac74327999c44d3eb46d6c8280531147',ramdisk_id='',reservation_id='r-9wnpj0to',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1420949844',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-14
20949844-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:48:12Z,user_data=None,user_id='796383ba775b4e63982ab22e1ab7e3e4',uuid=ad53b0ad-a45d-429f-9f25-0da979224b83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.154 2 DEBUG nova.network.os_vif_util [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Converting VIF {"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.155 2 DEBUG nova.network.os_vif_util [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:4f:82,bridge_name='br-int',has_traffic_filtering=True,id=6b981388-21bd-4946-8422-9a7fcd19cd4a,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b981388-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.156 2 DEBUG nova.objects.instance [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lazy-loading 'pci_devices' on Instance uuid ad53b0ad-a45d-429f-9f25-0da979224b83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.172 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] End _get_guest_xml xml=<domain type="kvm">
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <uuid>ad53b0ad-a45d-429f-9f25-0da979224b83</uuid>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <name>instance-00000026</name>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <memory>131072</memory>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <vcpu>1</vcpu>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <metadata>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-319216127</nova:name>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <nova:creationTime>2025-10-02 08:48:15</nova:creationTime>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <nova:flavor name="m1.nano">
Oct 02 08:48:15 compute-0 nova_compute[192567]:         <nova:memory>128</nova:memory>
Oct 02 08:48:15 compute-0 nova_compute[192567]:         <nova:disk>1</nova:disk>
Oct 02 08:48:15 compute-0 nova_compute[192567]:         <nova:swap>0</nova:swap>
Oct 02 08:48:15 compute-0 nova_compute[192567]:         <nova:ephemeral>0</nova:ephemeral>
Oct 02 08:48:15 compute-0 nova_compute[192567]:         <nova:vcpus>1</nova:vcpus>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       </nova:flavor>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <nova:owner>
Oct 02 08:48:15 compute-0 nova_compute[192567]:         <nova:user uuid="796383ba775b4e63982ab22e1ab7e3e4">tempest-TestExecuteZoneMigrationStrategy-1420949844-project-admin</nova:user>
Oct 02 08:48:15 compute-0 nova_compute[192567]:         <nova:project uuid="ac74327999c44d3eb46d6c8280531147">tempest-TestExecuteZoneMigrationStrategy-1420949844</nova:project>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       </nova:owner>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <nova:root type="image" uuid="f5cf0efc-6f3c-4865-b002-490e9c9b250d"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <nova:ports>
Oct 02 08:48:15 compute-0 nova_compute[192567]:         <nova:port uuid="6b981388-21bd-4946-8422-9a7fcd19cd4a">
Oct 02 08:48:15 compute-0 nova_compute[192567]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:         </nova:port>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       </nova:ports>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     </nova:instance>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   </metadata>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <sysinfo type="smbios">
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <system>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <entry name="manufacturer">RDO</entry>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <entry name="product">OpenStack Compute</entry>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <entry name="serial">ad53b0ad-a45d-429f-9f25-0da979224b83</entry>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <entry name="uuid">ad53b0ad-a45d-429f-9f25-0da979224b83</entry>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <entry name="family">Virtual Machine</entry>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     </system>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   </sysinfo>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <os>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <boot dev="hd"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <smbios mode="sysinfo"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   </os>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <features>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <acpi/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <apic/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <vmcoreinfo/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   </features>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <clock offset="utc">
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <timer name="pit" tickpolicy="delay"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <timer name="hpet" present="no"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   </clock>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <cpu mode="host-model" match="exact">
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <topology sockets="1" cores="1" threads="1"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   </cpu>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   <devices>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <disk type="file" device="disk">
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <target dev="vda" bus="virtio"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <disk type="file" device="cdrom">
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <driver name="qemu" type="raw" cache="none"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <source file="/var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk.config"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <target dev="sda" bus="sata"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     </disk>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <interface type="ethernet">
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <mac address="fa:16:3e:2a:4f:82"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <driver name="vhost" rx_queue_size="512"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <mtu size="1442"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <target dev="tap6b981388-21"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     </interface>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <serial type="pty">
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <log file="/var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/console.log" append="off"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     </serial>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <video>
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <model type="virtio"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     </video>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <input type="tablet" bus="usb"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <rng model="virtio">
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <backend model="random">/dev/urandom</backend>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     </rng>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="pci" model="pcie-root-port"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <controller type="usb" index="0"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     <memballoon model="virtio">
Oct 02 08:48:15 compute-0 nova_compute[192567]:       <stats period="10"/>
Oct 02 08:48:15 compute-0 nova_compute[192567]:     </memballoon>
Oct 02 08:48:15 compute-0 nova_compute[192567]:   </devices>
Oct 02 08:48:15 compute-0 nova_compute[192567]: </domain>
Oct 02 08:48:15 compute-0 nova_compute[192567]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.173 2 DEBUG nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Preparing to wait for external event network-vif-plugged-6b981388-21bd-4946-8422-9a7fcd19cd4a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.173 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.174 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.174 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.175 2 DEBUG nova.virt.libvirt.vif [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-02T08:48:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-319216127',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-319216127',id=38,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ac74327999c44d3eb46d6c8280531147',ramdisk_id='',reservation_id='r-9wnpj0to',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1420949844',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1420949844-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:48:12Z,user_data=None,user_id='796383ba775b4e63982ab22e1ab7e3e4',uuid=ad53b0ad-a45d-429f-9f25-0da979224b83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.176 2 DEBUG nova.network.os_vif_util [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Converting VIF {"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.176 2 DEBUG nova.network.os_vif_util [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:4f:82,bridge_name='br-int',has_traffic_filtering=True,id=6b981388-21bd-4946-8422-9a7fcd19cd4a,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b981388-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.184 2 DEBUG os_vif [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:4f:82,bridge_name='br-int',has_traffic_filtering=True,id=6b981388-21bd-4946-8422-9a7fcd19cd4a,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b981388-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.188 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.196 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b981388-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.197 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6b981388-21, col_values=(('external_ids', {'iface-id': '6b981388-21bd-4946-8422-9a7fcd19cd4a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2a:4f:82', 'vm-uuid': 'ad53b0ad-a45d-429f-9f25-0da979224b83'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:48:15 compute-0 NetworkManager[51654]: <info>  [1759394895.2033] manager: (tap6b981388-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.213 2 INFO os_vif [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:4f:82,bridge_name='br-int',has_traffic_filtering=True,id=6b981388-21bd-4946-8422-9a7fcd19cd4a,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b981388-21')
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.278 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.278 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.279 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] No VIF found with MAC fa:16:3e:2a:4f:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.280 2 INFO nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Using config drive
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.653 2 INFO nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Creating config drive at /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk.config
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.662 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnjd55c00 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.798 2 DEBUG oslo_concurrency.processutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnjd55c00" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:48:15 compute-0 kernel: tap6b981388-21: entered promiscuous mode
Oct 02 08:48:15 compute-0 NetworkManager[51654]: <info>  [1759394895.9164] manager: (tap6b981388-21): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:15 compute-0 ovn_controller[94821]: 2025-10-02T08:48:15Z|00254|binding|INFO|Claiming lport 6b981388-21bd-4946-8422-9a7fcd19cd4a for this chassis.
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:15 compute-0 ovn_controller[94821]: 2025-10-02T08:48:15Z|00255|binding|INFO|6b981388-21bd-4946-8422-9a7fcd19cd4a: Claiming fa:16:3e:2a:4f:82 10.100.0.5
Oct 02 08:48:15 compute-0 nova_compute[192567]: 2025-10-02 08:48:15.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:15 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:15.943 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:4f:82 10.100.0.5'], port_security=['fa:16:3e:2a:4f:82 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ad53b0ad-a45d-429f-9f25-0da979224b83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac74327999c44d3eb46d6c8280531147', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed8f6343-b875-4638-92b8-f2446903f322', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a95bbfad-5b25-4f75-9394-85dc68a3a437, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=6b981388-21bd-4946-8422-9a7fcd19cd4a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:48:15 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:15.945 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 6b981388-21bd-4946-8422-9a7fcd19cd4a in datapath dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d bound to our chassis
Oct 02 08:48:15 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:15.948 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d
Oct 02 08:48:15 compute-0 systemd-udevd[229518]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:48:15 compute-0 systemd-machined[152597]: New machine qemu-24-instance-00000026.
Oct 02 08:48:15 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:15.972 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e01420bb-cbcc-49fd-a1f1-27b973c6796a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:15 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:15.974 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdfb4a911-c1 in ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 02 08:48:15 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:15.977 215188 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdfb4a911-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 02 08:48:15 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:15.977 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[17ae5b02-dabd-4347-831f-c248a1c1482e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:15 compute-0 NetworkManager[51654]: <info>  [1759394895.9800] device (tap6b981388-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:48:15 compute-0 NetworkManager[51654]: <info>  [1759394895.9811] device (tap6b981388-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:48:15 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:15.979 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4c985892-5e1c-499f-a65b-09d9cc806abb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-00000026.
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:16 compute-0 ovn_controller[94821]: 2025-10-02T08:48:16Z|00256|binding|INFO|Setting lport 6b981388-21bd-4946-8422-9a7fcd19cd4a ovn-installed in OVS
Oct 02 08:48:16 compute-0 ovn_controller[94821]: 2025-10-02T08:48:16Z|00257|binding|INFO|Setting lport 6b981388-21bd-4946-8422-9a7fcd19cd4a up in Southbound
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.003 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4c23a1-5902-4fe3-94ca-4f39e9e76a66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.040 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8026828e-81c1-4f0d-99e7-80018ef86b72]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.107 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0717d2-d9a3-4fae-84a2-028160ce95f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.116 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[306a1b51-fd79-4a3b-9eff-fd6dcfa86767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 NetworkManager[51654]: <info>  [1759394896.1183] manager: (tapdfb4a911-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.171 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[0dde7b8b-e7af-4b6a-ad51-db9a40f14792]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.176 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[f92d1170-ab70-4004-9bd1-843c37efae62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 NetworkManager[51654]: <info>  [1759394896.2206] device (tapdfb4a911-c0): carrier: link connected
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.228 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[f04473bc-418f-4af2-8374-6c0ec52a024b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.255 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[4a31fd09-3b65-43ad-88a9-d24537d96768]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdfb4a911-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:27:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578381, 'reachable_time': 19308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229552, 'error': None, 'target': 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.283 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e6af2ce1-c961-48aa-ba02-462760209e50]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:27bd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 578381, 'tstamp': 578381}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229554, 'error': None, 'target': 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.309 2 DEBUG nova.network.neutron [req-efc41ec6-aaaa-47e1-b6b1-04b3ee858369 req-fbd4af0b-cc73-416d-9cbd-3175190c9419 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Updated VIF entry in instance network info cache for port 6b981388-21bd-4946-8422-9a7fcd19cd4a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.312 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[b2bb72a1-154f-4858-87db-f38212e4903b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdfb4a911-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:27:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578381, 'reachable_time': 19308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229555, 'error': None, 'target': 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.310 2 DEBUG nova.network.neutron [req-efc41ec6-aaaa-47e1-b6b1-04b3ee858369 req-fbd4af0b-cc73-416d-9cbd-3175190c9419 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Updating instance_info_cache with network_info: [{"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.328 2 DEBUG oslo_concurrency.lockutils [req-efc41ec6-aaaa-47e1-b6b1-04b3ee858369 req-fbd4af0b-cc73-416d-9cbd-3175190c9419 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.364 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0d6bbc10-c06f-4777-8777-740c81e0ec4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.418 2 DEBUG nova.compute.manager [req-c6efd83a-337f-4421-ad31-88d2aa00a093 req-4df5a9e3-cd37-4e7c-b063-913e35756974 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Received event network-vif-plugged-6b981388-21bd-4946-8422-9a7fcd19cd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.419 2 DEBUG oslo_concurrency.lockutils [req-c6efd83a-337f-4421-ad31-88d2aa00a093 req-4df5a9e3-cd37-4e7c-b063-913e35756974 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.419 2 DEBUG oslo_concurrency.lockutils [req-c6efd83a-337f-4421-ad31-88d2aa00a093 req-4df5a9e3-cd37-4e7c-b063-913e35756974 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.420 2 DEBUG oslo_concurrency.lockutils [req-c6efd83a-337f-4421-ad31-88d2aa00a093 req-4df5a9e3-cd37-4e7c-b063-913e35756974 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.420 2 DEBUG nova.compute.manager [req-c6efd83a-337f-4421-ad31-88d2aa00a093 req-4df5a9e3-cd37-4e7c-b063-913e35756974 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Processing event network-vif-plugged-6b981388-21bd-4946-8422-9a7fcd19cd4a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.459 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[7a50e85f-af6b-4cee-88e3-b34a531bc118]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.460 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdfb4a911-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.461 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.461 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdfb4a911-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:16 compute-0 NetworkManager[51654]: <info>  [1759394896.4636] manager: (tapdfb4a911-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Oct 02 08:48:16 compute-0 kernel: tapdfb4a911-c0: entered promiscuous mode
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.470 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdfb4a911-c0, col_values=(('external_ids', {'iface-id': 'e37e4669-0eee-4206-a367-e3f45a77615a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:16 compute-0 ovn_controller[94821]: 2025-10-02T08:48:16Z|00258|binding|INFO|Releasing lport e37e4669-0eee-4206-a367-e3f45a77615a from this chassis (sb_readonly=0)
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.497 103703 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.497 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[e49f5d3b-0709-44b3-887e-7f959065bb6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.498 103703 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: global
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     log         /dev/log local0 debug
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     log-tag     haproxy-metadata-proxy-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     user        root
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     group       root
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     maxconn     1024
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     pidfile     /var/lib/neutron/external/pids/dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d.pid.haproxy
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     daemon
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: defaults
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     log global
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     mode http
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     option httplog
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     option dontlognull
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     option http-server-close
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     option forwardfor
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     retries                 3
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     timeout http-request    30s
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     timeout connect         30s
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     timeout client          32s
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     timeout server          32s
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     timeout http-keep-alive 30s
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: listen listener
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     bind 169.254.169.254:80
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     server metadata /var/lib/neutron/metadata_proxy
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:     http-request add-header X-OVN-Network-ID dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 02 08:48:16 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:16.499 103703 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'env', 'PROCESS_TAG=haproxy-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.850 2 DEBUG nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.852 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394896.8502295, ad53b0ad-a45d-429f-9f25-0da979224b83 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.852 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] VM Started (Lifecycle Event)
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.855 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.858 2 INFO nova.virt.libvirt.driver [-] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Instance spawned successfully.
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.858 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.894 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.900 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.903 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.903 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.904 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.904 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.904 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.905 2 DEBUG nova.virt.libvirt.driver [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 02 08:48:16 compute-0 podman[229594]: 2025-10-02 08:48:16.941004896 +0000 UTC m=+0.050445731 container create c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.952 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.952 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394896.8513937, ad53b0ad-a45d-429f-9f25-0da979224b83 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.953 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] VM Paused (Lifecycle Event)
Oct 02 08:48:16 compute-0 systemd[1]: Started libpod-conmon-c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383.scope.
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.989 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.995 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394896.8547213, ad53b0ad-a45d-429f-9f25-0da979224b83 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:48:16 compute-0 nova_compute[192567]: 2025-10-02 08:48:16.995 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] VM Resumed (Lifecycle Event)
Oct 02 08:48:17 compute-0 systemd[1]: Started libcrun container.
Oct 02 08:48:17 compute-0 podman[229594]: 2025-10-02 08:48:16.915381099 +0000 UTC m=+0.024821964 image pull 269d9fde257fe51bcfc3411ed4c4c36a03b726658e91b83df1028da499438537 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct 02 08:48:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc8eee427fe440cede31350228bc85150247d6bb7d58f66f98ba150f948b5a90/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 02 08:48:17 compute-0 nova_compute[192567]: 2025-10-02 08:48:17.017 2 INFO nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Took 4.51 seconds to spawn the instance on the hypervisor.
Oct 02 08:48:17 compute-0 nova_compute[192567]: 2025-10-02 08:48:17.018 2 DEBUG nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:48:17 compute-0 nova_compute[192567]: 2025-10-02 08:48:17.101 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:48:17 compute-0 nova_compute[192567]: 2025-10-02 08:48:17.109 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:48:17 compute-0 podman[229594]: 2025-10-02 08:48:17.135040188 +0000 UTC m=+0.244481093 container init c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 02 08:48:17 compute-0 podman[229594]: 2025-10-02 08:48:17.146923307 +0000 UTC m=+0.256364162 container start c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 02 08:48:17 compute-0 neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d[229609]: [NOTICE]   (229613) : New worker (229615) forked
Oct 02 08:48:17 compute-0 neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d[229609]: [NOTICE]   (229613) : Loading success.
Oct 02 08:48:17 compute-0 nova_compute[192567]: 2025-10-02 08:48:17.206 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 02 08:48:17 compute-0 nova_compute[192567]: 2025-10-02 08:48:17.395 2 INFO nova.compute.manager [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Took 5.39 seconds to build instance.
Oct 02 08:48:17 compute-0 nova_compute[192567]: 2025-10-02 08:48:17.517 2 DEBUG oslo_concurrency.lockutils [None req-41f7a294-f22b-4ac6-90f4-f838d62dbb51 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:18 compute-0 nova_compute[192567]: 2025-10-02 08:48:18.521 2 DEBUG nova.compute.manager [req-a8ab4bdc-c41f-4954-ac86-a6b6fabefbe9 req-d0d89735-3ded-4418-aecb-65539b091e47 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Received event network-vif-plugged-6b981388-21bd-4946-8422-9a7fcd19cd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:48:18 compute-0 nova_compute[192567]: 2025-10-02 08:48:18.522 2 DEBUG oslo_concurrency.lockutils [req-a8ab4bdc-c41f-4954-ac86-a6b6fabefbe9 req-d0d89735-3ded-4418-aecb-65539b091e47 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:18 compute-0 nova_compute[192567]: 2025-10-02 08:48:18.523 2 DEBUG oslo_concurrency.lockutils [req-a8ab4bdc-c41f-4954-ac86-a6b6fabefbe9 req-d0d89735-3ded-4418-aecb-65539b091e47 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:18 compute-0 nova_compute[192567]: 2025-10-02 08:48:18.523 2 DEBUG oslo_concurrency.lockutils [req-a8ab4bdc-c41f-4954-ac86-a6b6fabefbe9 req-d0d89735-3ded-4418-aecb-65539b091e47 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:18 compute-0 nova_compute[192567]: 2025-10-02 08:48:18.524 2 DEBUG nova.compute.manager [req-a8ab4bdc-c41f-4954-ac86-a6b6fabefbe9 req-d0d89735-3ded-4418-aecb-65539b091e47 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] No waiting events found dispatching network-vif-plugged-6b981388-21bd-4946-8422-9a7fcd19cd4a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:48:18 compute-0 nova_compute[192567]: 2025-10-02 08:48:18.524 2 WARNING nova.compute.manager [req-a8ab4bdc-c41f-4954-ac86-a6b6fabefbe9 req-d0d89735-3ded-4418-aecb-65539b091e47 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Received unexpected event network-vif-plugged-6b981388-21bd-4946-8422-9a7fcd19cd4a for instance with vm_state active and task_state None.
Oct 02 08:48:19 compute-0 podman[229624]: 2025-10-02 08:48:19.165374868 +0000 UTC m=+0.072916861 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 08:48:19 compute-0 podman[229627]: 2025-10-02 08:48:19.174886304 +0000 UTC m=+0.078312729 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 02 08:48:19 compute-0 podman[229626]: 2025-10-02 08:48:19.176350899 +0000 UTC m=+0.074376466 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 02 08:48:19 compute-0 podman[229625]: 2025-10-02 08:48:19.204520597 +0000 UTC m=+0.109766489 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 02 08:48:19 compute-0 nova_compute[192567]: 2025-10-02 08:48:19.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:20 compute-0 nova_compute[192567]: 2025-10-02 08:48:20.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:24 compute-0 nova_compute[192567]: 2025-10-02 08:48:24.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:25 compute-0 podman[229703]: 2025-10-02 08:48:25.178338109 +0000 UTC m=+0.081847409 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:48:25 compute-0 nova_compute[192567]: 2025-10-02 08:48:25.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:29 compute-0 nova_compute[192567]: 2025-10-02 08:48:29.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:29 compute-0 podman[203011]: time="2025-10-02T08:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:48:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:48:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3473 "" "Go-http-client/1.1"
Oct 02 08:48:30 compute-0 nova_compute[192567]: 2025-10-02 08:48:30.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:31 compute-0 openstack_network_exporter[205118]: ERROR   08:48:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:48:31 compute-0 openstack_network_exporter[205118]: ERROR   08:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:48:31 compute-0 openstack_network_exporter[205118]: ERROR   08:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:48:31 compute-0 openstack_network_exporter[205118]: ERROR   08:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:48:31 compute-0 openstack_network_exporter[205118]: ERROR   08:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:48:31 compute-0 ovn_controller[94821]: 2025-10-02T08:48:31Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2a:4f:82 10.100.0.5
Oct 02 08:48:31 compute-0 ovn_controller[94821]: 2025-10-02T08:48:31Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2a:4f:82 10.100.0.5
Oct 02 08:48:34 compute-0 nova_compute[192567]: 2025-10-02 08:48:34.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:35 compute-0 podman[229746]: 2025-10-02 08:48:35.169759224 +0000 UTC m=+0.084470041 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 02 08:48:35 compute-0 nova_compute[192567]: 2025-10-02 08:48:35.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:39 compute-0 nova_compute[192567]: 2025-10-02 08:48:39.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:40 compute-0 nova_compute[192567]: 2025-10-02 08:48:40.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:44 compute-0 nova_compute[192567]: 2025-10-02 08:48:44.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:45 compute-0 nova_compute[192567]: 2025-10-02 08:48:45.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:46.014 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:48:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:46.015 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:48:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:48:46.016 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:48:46 compute-0 ovn_controller[94821]: 2025-10-02T08:48:46Z|00259|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 02 08:48:49 compute-0 nova_compute[192567]: 2025-10-02 08:48:49.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:50 compute-0 podman[229767]: 2025-10-02 08:48:50.183219111 +0000 UTC m=+0.089673443 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:48:50 compute-0 podman[229769]: 2025-10-02 08:48:50.216717644 +0000 UTC m=+0.100230121 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:48:50 compute-0 podman[229775]: 2025-10-02 08:48:50.238472302 +0000 UTC m=+0.121187165 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible)
Oct 02 08:48:50 compute-0 podman[229768]: 2025-10-02 08:48:50.249008659 +0000 UTC m=+0.143918051 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:48:50 compute-0 nova_compute[192567]: 2025-10-02 08:48:50.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:54 compute-0 nova_compute[192567]: 2025-10-02 08:48:54.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:54 compute-0 nova_compute[192567]: 2025-10-02 08:48:54.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:48:54 compute-0 nova_compute[192567]: 2025-10-02 08:48:54.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:48:54 compute-0 nova_compute[192567]: 2025-10-02 08:48:54.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:48:55 compute-0 nova_compute[192567]: 2025-10-02 08:48:55.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:55 compute-0 nova_compute[192567]: 2025-10-02 08:48:55.834 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:48:55 compute-0 nova_compute[192567]: 2025-10-02 08:48:55.834 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:48:55 compute-0 nova_compute[192567]: 2025-10-02 08:48:55.835 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:48:55 compute-0 nova_compute[192567]: 2025-10-02 08:48:55.835 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ad53b0ad-a45d-429f-9f25-0da979224b83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:48:56 compute-0 podman[229851]: 2025-10-02 08:48:56.185876172 +0000 UTC m=+0.089504158 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:48:58 compute-0 nova_compute[192567]: 2025-10-02 08:48:58.861 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Updating instance_info_cache with network_info: [{"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:48:58 compute-0 nova_compute[192567]: 2025-10-02 08:48:58.890 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:48:58 compute-0 nova_compute[192567]: 2025-10-02 08:48:58.891 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:48:59 compute-0 nova_compute[192567]: 2025-10-02 08:48:59.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:48:59 compute-0 podman[203011]: time="2025-10-02T08:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:48:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:48:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3472 "" "Go-http-client/1.1"
Oct 02 08:49:00 compute-0 nova_compute[192567]: 2025-10-02 08:49:00.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:00 compute-0 nova_compute[192567]: 2025-10-02 08:49:00.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:49:00 compute-0 nova_compute[192567]: 2025-10-02 08:49:00.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:49:00 compute-0 nova_compute[192567]: 2025-10-02 08:49:00.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:49:01 compute-0 openstack_network_exporter[205118]: ERROR   08:49:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:49:01 compute-0 openstack_network_exporter[205118]: ERROR   08:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:49:01 compute-0 openstack_network_exporter[205118]: ERROR   08:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:49:01 compute-0 openstack_network_exporter[205118]: ERROR   08:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:49:01 compute-0 openstack_network_exporter[205118]: ERROR   08:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:49:01 compute-0 nova_compute[192567]: 2025-10-02 08:49:01.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:49:01 compute-0 nova_compute[192567]: 2025-10-02 08:49:01.652 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:49:01 compute-0 nova_compute[192567]: 2025-10-02 08:49:01.653 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:49:01 compute-0 nova_compute[192567]: 2025-10-02 08:49:01.653 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:49:01 compute-0 nova_compute[192567]: 2025-10-02 08:49:01.653 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:49:01 compute-0 nova_compute[192567]: 2025-10-02 08:49:01.732 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:49:01 compute-0 nova_compute[192567]: 2025-10-02 08:49:01.795 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:49:01 compute-0 nova_compute[192567]: 2025-10-02 08:49:01.797 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:49:01 compute-0 nova_compute[192567]: 2025-10-02 08:49:01.862 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.116 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.118 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5644MB free_disk=73.43273162841797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.118 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.119 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.206 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance ad53b0ad-a45d-429f-9f25-0da979224b83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.206 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.206 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.306 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.328 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.356 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:49:02 compute-0 nova_compute[192567]: 2025-10-02 08:49:02.356 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:49:03 compute-0 nova_compute[192567]: 2025-10-02 08:49:03.356 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:49:04 compute-0 nova_compute[192567]: 2025-10-02 08:49:04.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:05 compute-0 nova_compute[192567]: 2025-10-02 08:49:05.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:06 compute-0 podman[229882]: 2025-10-02 08:49:06.212768779 +0000 UTC m=+0.089077634 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:49:06 compute-0 nova_compute[192567]: 2025-10-02 08:49:06.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:49:07 compute-0 nova_compute[192567]: 2025-10-02 08:49:07.627 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:49:07 compute-0 nova_compute[192567]: 2025-10-02 08:49:07.628 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:49:09 compute-0 nova_compute[192567]: 2025-10-02 08:49:09.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:10 compute-0 nova_compute[192567]: 2025-10-02 08:49:10.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:10 compute-0 nova_compute[192567]: 2025-10-02 08:49:10.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:49:14 compute-0 nova_compute[192567]: 2025-10-02 08:49:14.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:15 compute-0 nova_compute[192567]: 2025-10-02 08:49:15.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:19 compute-0 nova_compute[192567]: 2025-10-02 08:49:19.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:20 compute-0 nova_compute[192567]: 2025-10-02 08:49:20.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:21 compute-0 podman[229908]: 2025-10-02 08:49:21.194857661 +0000 UTC m=+0.082048865 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:49:21 compute-0 podman[229906]: 2025-10-02 08:49:21.205366718 +0000 UTC m=+0.102461301 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:49:21 compute-0 podman[229909]: 2025-10-02 08:49:21.21730443 +0000 UTC m=+0.089047704 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 02 08:49:21 compute-0 podman[229907]: 2025-10-02 08:49:21.260600108 +0000 UTC m=+0.145972316 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct 02 08:49:24 compute-0 nova_compute[192567]: 2025-10-02 08:49:24.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:25 compute-0 nova_compute[192567]: 2025-10-02 08:49:25.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:27 compute-0 podman[229984]: 2025-10-02 08:49:27.165794535 +0000 UTC m=+0.071380443 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:49:29 compute-0 nova_compute[192567]: 2025-10-02 08:49:29.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:29 compute-0 podman[203011]: time="2025-10-02T08:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:49:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:49:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Oct 02 08:49:30 compute-0 nova_compute[192567]: 2025-10-02 08:49:30.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:31 compute-0 openstack_network_exporter[205118]: ERROR   08:49:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:49:31 compute-0 openstack_network_exporter[205118]: ERROR   08:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:49:31 compute-0 openstack_network_exporter[205118]: ERROR   08:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:49:31 compute-0 openstack_network_exporter[205118]: ERROR   08:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:49:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:49:31 compute-0 openstack_network_exporter[205118]: ERROR   08:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:49:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:49:34 compute-0 nova_compute[192567]: 2025-10-02 08:49:34.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:35 compute-0 nova_compute[192567]: 2025-10-02 08:49:35.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:37 compute-0 podman[230008]: 2025-10-02 08:49:37.195304466 +0000 UTC m=+0.095576197 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Oct 02 08:49:39 compute-0 nova_compute[192567]: 2025-10-02 08:49:39.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:40 compute-0 nova_compute[192567]: 2025-10-02 08:49:40.178 2 DEBUG nova.virt.libvirt.driver [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Creating tmpfile /var/lib/nova/instances/tmpcgbxnfbd to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct 02 08:49:40 compute-0 nova_compute[192567]: 2025-10-02 08:49:40.275 2 DEBUG nova.compute.manager [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcgbxnfbd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct 02 08:49:40 compute-0 nova_compute[192567]: 2025-10-02 08:49:40.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:41 compute-0 nova_compute[192567]: 2025-10-02 08:49:41.056 2 DEBUG nova.compute.manager [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcgbxnfbd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1cb5175a-2ebc-4a24-a337-18818506b843',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct 02 08:49:41 compute-0 nova_compute[192567]: 2025-10-02 08:49:41.090 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-1cb5175a-2ebc-4a24-a337-18818506b843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:49:41 compute-0 nova_compute[192567]: 2025-10-02 08:49:41.090 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-1cb5175a-2ebc-4a24-a337-18818506b843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:49:41 compute-0 nova_compute[192567]: 2025-10-02 08:49:41.091 2 DEBUG nova.network.neutron [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.235 2 DEBUG nova.network.neutron [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Updating instance_info_cache with network_info: [{"id": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "address": "fa:16:3e:93:ad:bc", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfabf06ed-2e", "ovs_interfaceid": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.253 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-1cb5175a-2ebc-4a24-a337-18818506b843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.255 2 DEBUG nova.virt.libvirt.driver [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcgbxnfbd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1cb5175a-2ebc-4a24-a337-18818506b843',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.256 2 DEBUG nova.virt.libvirt.driver [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Creating instance directory: /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.257 2 DEBUG nova.virt.libvirt.driver [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Creating disk.info with the contents: {'/var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk': 'qcow2', '/var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.258 2 DEBUG nova.virt.libvirt.driver [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.259 2 DEBUG nova.objects.instance [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1cb5175a-2ebc-4a24-a337-18818506b843 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.299 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.395 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.396 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "12631388dc43f98e9873c2b420db3037f701853e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.397 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.412 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.485 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.486 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.579 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e,backing_fmt=raw /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk 1073741824" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.582 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "12631388dc43f98e9873c2b420db3037f701853e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.583 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.666 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/12631388dc43f98e9873c2b420db3037f701853e --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.668 2 DEBUG nova.virt.disk.api [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Checking if we can resize image /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.669 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.754 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.755 2 DEBUG nova.virt.disk.api [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Cannot resize image /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.756 2 DEBUG nova.objects.instance [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lazy-loading 'migration_context' on Instance uuid 1cb5175a-2ebc-4a24-a337-18818506b843 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.795 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.831 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk.config 485376" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.834 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk.config to /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Oct 02 08:49:43 compute-0 nova_compute[192567]: 2025-10-02 08:49:43.834 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk.config /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.391 2 DEBUG oslo_concurrency.processutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk.config /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843" returned: 0 in 0.557s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.393 2 DEBUG nova.virt.libvirt.driver [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.395 2 DEBUG nova.virt.libvirt.vif [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:47:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1681371285',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1681371285',id=37,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:48:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ac74327999c44d3eb46d6c8280531147',ramdisk_id='',reservation_id='r-t6vpxvgl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1420949844',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1420949844-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-02T08:48:02Z,user_data=None,user_id='796383ba775b4e63982ab22e1ab7e3e4',uuid=1cb5175a-2ebc-4a24-a337-18818506b843,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "address": "fa:16:3e:93:ad:bc", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfabf06ed-2e", "ovs_interfaceid": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.396 2 DEBUG nova.network.os_vif_util [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converting VIF {"id": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "address": "fa:16:3e:93:ad:bc", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfabf06ed-2e", "ovs_interfaceid": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.398 2 DEBUG nova.network.os_vif_util [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:93:ad:bc,bridge_name='br-int',has_traffic_filtering=True,id=fabf06ed-2e57-414c-a22a-c1211852e0e0,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfabf06ed-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.399 2 DEBUG os_vif [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:ad:bc,bridge_name='br-int',has_traffic_filtering=True,id=fabf06ed-2e57-414c-a22a-c1211852e0e0,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfabf06ed-2e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.401 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.402 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.408 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfabf06ed-2e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.409 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfabf06ed-2e, col_values=(('external_ids', {'iface-id': 'fabf06ed-2e57-414c-a22a-c1211852e0e0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:93:ad:bc', 'vm-uuid': '1cb5175a-2ebc-4a24-a337-18818506b843'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:44 compute-0 NetworkManager[51654]: <info>  [1759394984.4633] manager: (tapfabf06ed-2e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.476 2 INFO os_vif [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:ad:bc,bridge_name='br-int',has_traffic_filtering=True,id=fabf06ed-2e57-414c-a22a-c1211852e0e0,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfabf06ed-2e')
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.476 2 DEBUG nova.virt.libvirt.driver [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.477 2 DEBUG nova.compute.manager [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcgbxnfbd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1cb5175a-2ebc-4a24-a337-18818506b843',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Oct 02 08:49:44 compute-0 nova_compute[192567]: 2025-10-02 08:49:44.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:45 compute-0 nova_compute[192567]: 2025-10-02 08:49:45.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:45.359 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:49:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:45.362 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:49:45 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:45.363 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:49:45 compute-0 nova_compute[192567]: 2025-10-02 08:49:45.867 2 DEBUG nova.network.neutron [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Port fabf06ed-2e57-414c-a22a-c1211852e0e0 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Oct 02 08:49:45 compute-0 nova_compute[192567]: 2025-10-02 08:49:45.870 2 DEBUG nova.compute.manager [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcgbxnfbd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1cb5175a-2ebc-4a24-a337-18818506b843',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Oct 02 08:49:46 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 02 08:49:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:46.015 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:49:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:46.016 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:49:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:46.017 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:49:46 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 02 08:49:46 compute-0 kernel: tapfabf06ed-2e: entered promiscuous mode
Oct 02 08:49:46 compute-0 NetworkManager[51654]: <info>  [1759394986.2508] manager: (tapfabf06ed-2e): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Oct 02 08:49:46 compute-0 ovn_controller[94821]: 2025-10-02T08:49:46Z|00260|binding|INFO|Claiming lport fabf06ed-2e57-414c-a22a-c1211852e0e0 for this additional chassis.
Oct 02 08:49:46 compute-0 ovn_controller[94821]: 2025-10-02T08:49:46Z|00261|binding|INFO|fabf06ed-2e57-414c-a22a-c1211852e0e0: Claiming fa:16:3e:93:ad:bc 10.100.0.11
Oct 02 08:49:46 compute-0 nova_compute[192567]: 2025-10-02 08:49:46.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:46 compute-0 ovn_controller[94821]: 2025-10-02T08:49:46Z|00262|binding|INFO|Setting lport fabf06ed-2e57-414c-a22a-c1211852e0e0 ovn-installed in OVS
Oct 02 08:49:46 compute-0 nova_compute[192567]: 2025-10-02 08:49:46.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:46 compute-0 nova_compute[192567]: 2025-10-02 08:49:46.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:46 compute-0 systemd-udevd[230082]: Network interface NamePolicy= disabled on kernel command line.
Oct 02 08:49:46 compute-0 systemd-machined[152597]: New machine qemu-25-instance-00000025.
Oct 02 08:49:46 compute-0 NetworkManager[51654]: <info>  [1759394986.3307] device (tapfabf06ed-2e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 02 08:49:46 compute-0 NetworkManager[51654]: <info>  [1759394986.3316] device (tapfabf06ed-2e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 02 08:49:46 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000025.
Oct 02 08:49:48 compute-0 nova_compute[192567]: 2025-10-02 08:49:48.021 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394988.0207398, 1cb5175a-2ebc-4a24-a337-18818506b843 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:49:48 compute-0 nova_compute[192567]: 2025-10-02 08:49:48.022 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] VM Started (Lifecycle Event)
Oct 02 08:49:48 compute-0 nova_compute[192567]: 2025-10-02 08:49:48.054 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:49:49 compute-0 nova_compute[192567]: 2025-10-02 08:49:49.030 2 DEBUG nova.virt.driver [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] Emitting event <LifecycleEvent: 1759394989.0290616, 1cb5175a-2ebc-4a24-a337-18818506b843 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:49:49 compute-0 nova_compute[192567]: 2025-10-02 08:49:49.031 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] VM Resumed (Lifecycle Event)
Oct 02 08:49:49 compute-0 nova_compute[192567]: 2025-10-02 08:49:49.055 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:49:49 compute-0 nova_compute[192567]: 2025-10-02 08:49:49.058 2 DEBUG nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 02 08:49:49 compute-0 nova_compute[192567]: 2025-10-02 08:49:49.084 2 INFO nova.compute.manager [None req-66592abd-315a-4229-8ce1-85a90649ae56 - - - - - -] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Oct 02 08:49:49 compute-0 nova_compute[192567]: 2025-10-02 08:49:49.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:49 compute-0 nova_compute[192567]: 2025-10-02 08:49:49.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:50 compute-0 ovn_controller[94821]: 2025-10-02T08:49:50Z|00263|binding|INFO|Claiming lport fabf06ed-2e57-414c-a22a-c1211852e0e0 for this chassis.
Oct 02 08:49:50 compute-0 ovn_controller[94821]: 2025-10-02T08:49:50Z|00264|binding|INFO|fabf06ed-2e57-414c-a22a-c1211852e0e0: Claiming fa:16:3e:93:ad:bc 10.100.0.11
Oct 02 08:49:50 compute-0 ovn_controller[94821]: 2025-10-02T08:49:50Z|00265|binding|INFO|Setting lport fabf06ed-2e57-414c-a22a-c1211852e0e0 up in Southbound
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.153 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:ad:bc 10.100.0.11'], port_security=['fa:16:3e:93:ad:bc 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1cb5175a-2ebc-4a24-a337-18818506b843', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac74327999c44d3eb46d6c8280531147', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'ed8f6343-b875-4638-92b8-f2446903f322', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a95bbfad-5b25-4f75-9394-85dc68a3a437, chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=fabf06ed-2e57-414c-a22a-c1211852e0e0) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.156 103703 INFO neutron.agent.ovn.metadata.agent [-] Port fabf06ed-2e57-414c-a22a-c1211852e0e0 in datapath dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d bound to our chassis
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.159 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.181 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[238c5d98-acf3-463b-9a50-33e9630bf06c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.220 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[a08d2b95-d525-4e47-a708-7fd2a579c2c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.225 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[97ad107b-6d2c-485e-80a3-c8b15d0156ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.264 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[dffcdb79-4ab5-4ad0-be28-2b9e52ecb391]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.280 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[6c896ee2-83f9-4173-ac84-369af1d61ec3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdfb4a911-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:27:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1126, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 5, 'rx_bytes': 1126, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578381, 'reachable_time': 19308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230118, 'error': None, 'target': 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.302 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[76845735-4658-464d-8c73-e0b54812f4ca]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdfb4a911-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 578399, 'tstamp': 578399}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230119, 'error': None, 'target': 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdfb4a911-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 578404, 'tstamp': 578404}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230119, 'error': None, 'target': 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.305 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdfb4a911-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.341 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdfb4a911-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.342 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.343 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdfb4a911-c0, col_values=(('external_ids', {'iface-id': 'e37e4669-0eee-4206-a367-e3f45a77615a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:49:50 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:49:50.343 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:49:50 compute-0 nova_compute[192567]: 2025-10-02 08:49:50.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:50 compute-0 nova_compute[192567]: 2025-10-02 08:49:50.488 2 INFO nova.compute.manager [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Post operation of migration started
Oct 02 08:49:50 compute-0 nova_compute[192567]: 2025-10-02 08:49:50.775 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "refresh_cache-1cb5175a-2ebc-4a24-a337-18818506b843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:49:50 compute-0 nova_compute[192567]: 2025-10-02 08:49:50.775 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquired lock "refresh_cache-1cb5175a-2ebc-4a24-a337-18818506b843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:49:50 compute-0 nova_compute[192567]: 2025-10-02 08:49:50.776 2 DEBUG nova.network.neutron [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 02 08:49:52 compute-0 podman[230120]: 2025-10-02 08:49:52.210328222 +0000 UTC m=+0.103501343 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:49:52 compute-0 podman[230128]: 2025-10-02 08:49:52.214192542 +0000 UTC m=+0.079299769 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:49:52 compute-0 podman[230122]: 2025-10-02 08:49:52.22567699 +0000 UTC m=+0.099821339 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:49:52 compute-0 podman[230121]: 2025-10-02 08:49:52.243771554 +0000 UTC m=+0.130710621 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 02 08:49:54 compute-0 nova_compute[192567]: 2025-10-02 08:49:54.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:54 compute-0 nova_compute[192567]: 2025-10-02 08:49:54.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:55 compute-0 nova_compute[192567]: 2025-10-02 08:49:55.134 2 DEBUG nova.network.neutron [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Updating instance_info_cache with network_info: [{"id": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "address": "fa:16:3e:93:ad:bc", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfabf06ed-2e", "ovs_interfaceid": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:49:55 compute-0 nova_compute[192567]: 2025-10-02 08:49:55.154 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Releasing lock "refresh_cache-1cb5175a-2ebc-4a24-a337-18818506b843" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:49:55 compute-0 nova_compute[192567]: 2025-10-02 08:49:55.173 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:49:55 compute-0 nova_compute[192567]: 2025-10-02 08:49:55.174 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:49:55 compute-0 nova_compute[192567]: 2025-10-02 08:49:55.174 2 DEBUG oslo_concurrency.lockutils [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:49:55 compute-0 nova_compute[192567]: 2025-10-02 08:49:55.180 2 INFO nova.virt.libvirt.driver [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 02 08:49:55 compute-0 virtqemud[192112]: Domain id=25 name='instance-00000025' uuid=1cb5175a-2ebc-4a24-a337-18818506b843 is tainted: custom-monitor
Oct 02 08:49:56 compute-0 nova_compute[192567]: 2025-10-02 08:49:56.188 2 INFO nova.virt.libvirt.driver [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 02 08:49:56 compute-0 nova_compute[192567]: 2025-10-02 08:49:56.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:49:56 compute-0 nova_compute[192567]: 2025-10-02 08:49:56.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:49:56 compute-0 nova_compute[192567]: 2025-10-02 08:49:56.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:49:56 compute-0 nova_compute[192567]: 2025-10-02 08:49:56.843 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 02 08:49:56 compute-0 nova_compute[192567]: 2025-10-02 08:49:56.844 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquired lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 02 08:49:56 compute-0 nova_compute[192567]: 2025-10-02 08:49:56.844 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 02 08:49:56 compute-0 nova_compute[192567]: 2025-10-02 08:49:56.844 2 DEBUG nova.objects.instance [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ad53b0ad-a45d-429f-9f25-0da979224b83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:49:57 compute-0 nova_compute[192567]: 2025-10-02 08:49:57.199 2 INFO nova.virt.libvirt.driver [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 02 08:49:57 compute-0 nova_compute[192567]: 2025-10-02 08:49:57.206 2 DEBUG nova.compute.manager [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:49:57 compute-0 nova_compute[192567]: 2025-10-02 08:49:57.234 2 DEBUG nova.objects.instance [None req-74645b72-98c9-4a0c-950f-c7dcfd994bee f8c7603494c74312835ee6fd77fe978c 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Oct 02 08:49:58 compute-0 podman[230200]: 2025-10-02 08:49:58.159794525 +0000 UTC m=+0.071757904 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:49:58 compute-0 nova_compute[192567]: 2025-10-02 08:49:58.419 2 DEBUG nova.network.neutron [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Updating instance_info_cache with network_info: [{"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:49:58 compute-0 nova_compute[192567]: 2025-10-02 08:49:58.570 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Releasing lock "refresh_cache-ad53b0ad-a45d-429f-9f25-0da979224b83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 02 08:49:58 compute-0 nova_compute[192567]: 2025-10-02 08:49:58.570 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 02 08:49:59 compute-0 nova_compute[192567]: 2025-10-02 08:49:59.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:59 compute-0 nova_compute[192567]: 2025-10-02 08:49:59.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:49:59 compute-0 podman[203011]: time="2025-10-02T08:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:49:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20780 "" "Go-http-client/1.1"
Oct 02 08:49:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Oct 02 08:50:00 compute-0 nova_compute[192567]: 2025-10-02 08:50:00.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:50:00 compute-0 nova_compute[192567]: 2025-10-02 08:50:00.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:50:01 compute-0 openstack_network_exporter[205118]: ERROR   08:50:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:50:01 compute-0 openstack_network_exporter[205118]: ERROR   08:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:50:01 compute-0 openstack_network_exporter[205118]: ERROR   08:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:50:01 compute-0 openstack_network_exporter[205118]: ERROR   08:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:50:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:50:01 compute-0 openstack_network_exporter[205118]: ERROR   08:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:50:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.657 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.658 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.658 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.658 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.743 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.831 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.832 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.928 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.936 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.977 2 DEBUG oslo_concurrency.lockutils [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "ad53b0ad-a45d-429f-9f25-0da979224b83" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.978 2 DEBUG oslo_concurrency.lockutils [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.979 2 DEBUG oslo_concurrency.lockutils [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.979 2 DEBUG oslo_concurrency.lockutils [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.979 2 DEBUG oslo_concurrency.lockutils [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.981 2 INFO nova.compute.manager [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Terminating instance
Oct 02 08:50:01 compute-0 nova_compute[192567]: 2025-10-02 08:50:01.982 2 DEBUG nova.compute.manager [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.000 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.002 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:50:02 compute-0 kernel: tap6b981388-21 (unregistering): left promiscuous mode
Oct 02 08:50:02 compute-0 NetworkManager[51654]: <info>  [1759395002.0190] device (tap6b981388-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:02 compute-0 ovn_controller[94821]: 2025-10-02T08:50:02Z|00266|binding|INFO|Releasing lport 6b981388-21bd-4946-8422-9a7fcd19cd4a from this chassis (sb_readonly=0)
Oct 02 08:50:02 compute-0 ovn_controller[94821]: 2025-10-02T08:50:02Z|00267|binding|INFO|Setting lport 6b981388-21bd-4946-8422-9a7fcd19cd4a down in Southbound
Oct 02 08:50:02 compute-0 ovn_controller[94821]: 2025-10-02T08:50:02Z|00268|binding|INFO|Removing iface tap6b981388-21 ovn-installed in OVS
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.050 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:4f:82 10.100.0.5'], port_security=['fa:16:3e:2a:4f:82 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ad53b0ad-a45d-429f-9f25-0da979224b83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac74327999c44d3eb46d6c8280531147', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed8f6343-b875-4638-92b8-f2446903f322', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a95bbfad-5b25-4f75-9394-85dc68a3a437, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=6b981388-21bd-4946-8422-9a7fcd19cd4a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.054 103703 INFO neutron.agent.ovn.metadata.agent [-] Port 6b981388-21bd-4946-8422-9a7fcd19cd4a in datapath dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d unbound from our chassis
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.057 103703 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.087 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[0dc9f153-e98d-41d9-bf04-65511d6d48bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.108 2 DEBUG oslo_concurrency.processutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83/disk --force-share --output=json" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:50:02 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000026.scope: Deactivated successfully.
Oct 02 08:50:02 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000026.scope: Consumed 17.901s CPU time.
Oct 02 08:50:02 compute-0 systemd-machined[152597]: Machine qemu-24-instance-00000026 terminated.
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.142 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4c6859-6c2f-480c-a6fc-a708bf28fd80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.149 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0f94a0-44e3-4fb2-b559-148ae65322ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.190 215209 DEBUG oslo.privsep.daemon [-] privsep: reply[f2282d12-a59b-4261-adfa-621f45e91014]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.225 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[f3a6558f-2c3e-4bd6-954c-0961e544b588]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdfb4a911-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:27:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578381, 'reachable_time': 19308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230253, 'error': None, 'target': 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.256 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[2e73f488-f815-47e4-a891-98be99ec4967]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdfb4a911-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 578399, 'tstamp': 578399}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230265, 'error': None, 'target': 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdfb4a911-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 578404, 'tstamp': 578404}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230265, 'error': None, 'target': 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.257 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdfb4a911-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.264 2 INFO nova.virt.libvirt.driver [-] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Instance destroyed successfully.
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.265 2 DEBUG nova.objects.instance [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lazy-loading 'resources' on Instance uuid ad53b0ad-a45d-429f-9f25-0da979224b83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.269 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdfb4a911-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.269 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.270 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdfb4a911-c0, col_values=(('external_ids', {'iface-id': 'e37e4669-0eee-4206-a367-e3f45a77615a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:50:02 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:02.270 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.319 2 DEBUG nova.virt.libvirt.vif [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-02T08:48:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-319216127',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-319216127',id=38,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:48:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac74327999c44d3eb46d6c8280531147',ramdisk_id='',reservation_id='r-9wnpj0to',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1420949844',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1420949844-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:48:17Z,user_data=None,user_id='796383ba775b4e63982ab22e1ab7e3e4',uuid=ad53b0ad-a45d-429f-9f25-0da979224b83,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.319 2 DEBUG nova.network.os_vif_util [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Converting VIF {"id": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "address": "fa:16:3e:2a:4f:82", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b981388-21", "ovs_interfaceid": "6b981388-21bd-4946-8422-9a7fcd19cd4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.320 2 DEBUG nova.network.os_vif_util [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2a:4f:82,bridge_name='br-int',has_traffic_filtering=True,id=6b981388-21bd-4946-8422-9a7fcd19cd4a,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b981388-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.321 2 DEBUG os_vif [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2a:4f:82,bridge_name='br-int',has_traffic_filtering=True,id=6b981388-21bd-4946-8422-9a7fcd19cd4a,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b981388-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.323 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b981388-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.328 2 INFO os_vif [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2a:4f:82,bridge_name='br-int',has_traffic_filtering=True,id=6b981388-21bd-4946-8422-9a7fcd19cd4a,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b981388-21')
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.329 2 INFO nova.virt.libvirt.driver [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Deleting instance files /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83_del
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.329 2 INFO nova.virt.libvirt.driver [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Deletion of /var/lib/nova/instances/ad53b0ad-a45d-429f-9f25-0da979224b83_del complete
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.360 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.361 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5479MB free_disk=73.40378189086914GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.362 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.362 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.388 2 INFO nova.compute.manager [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Took 0.40 seconds to destroy the instance on the hypervisor.
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.388 2 DEBUG oslo.service.loopingcall [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.390 2 DEBUG nova.compute.manager [-] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.390 2 DEBUG nova.network.neutron [-] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.447 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance ad53b0ad-a45d-429f-9f25-0da979224b83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.447 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Instance 1cb5175a-2ebc-4a24-a337-18818506b843 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.447 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.448 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.509 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.521 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.539 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:50:02 compute-0 nova_compute[192567]: 2025-10-02 08:50:02.540 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:03 compute-0 nova_compute[192567]: 2025-10-02 08:50:03.101 2 DEBUG nova.compute.manager [req-37a2dfe6-af42-4fef-bd82-3680687bbbed req-6f247eda-18be-4298-8d8f-d9a387a077df 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Received event network-vif-unplugged-6b981388-21bd-4946-8422-9a7fcd19cd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:50:03 compute-0 nova_compute[192567]: 2025-10-02 08:50:03.102 2 DEBUG oslo_concurrency.lockutils [req-37a2dfe6-af42-4fef-bd82-3680687bbbed req-6f247eda-18be-4298-8d8f-d9a387a077df 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:03 compute-0 nova_compute[192567]: 2025-10-02 08:50:03.103 2 DEBUG oslo_concurrency.lockutils [req-37a2dfe6-af42-4fef-bd82-3680687bbbed req-6f247eda-18be-4298-8d8f-d9a387a077df 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:03 compute-0 nova_compute[192567]: 2025-10-02 08:50:03.103 2 DEBUG oslo_concurrency.lockutils [req-37a2dfe6-af42-4fef-bd82-3680687bbbed req-6f247eda-18be-4298-8d8f-d9a387a077df 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:03 compute-0 nova_compute[192567]: 2025-10-02 08:50:03.104 2 DEBUG nova.compute.manager [req-37a2dfe6-af42-4fef-bd82-3680687bbbed req-6f247eda-18be-4298-8d8f-d9a387a077df 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] No waiting events found dispatching network-vif-unplugged-6b981388-21bd-4946-8422-9a7fcd19cd4a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:50:03 compute-0 nova_compute[192567]: 2025-10-02 08:50:03.104 2 DEBUG nova.compute.manager [req-37a2dfe6-af42-4fef-bd82-3680687bbbed req-6f247eda-18be-4298-8d8f-d9a387a077df 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Received event network-vif-unplugged-6b981388-21bd-4946-8422-9a7fcd19cd4a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:50:03 compute-0 nova_compute[192567]: 2025-10-02 08:50:03.539 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:50:03 compute-0 nova_compute[192567]: 2025-10-02 08:50:03.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.070 2 DEBUG nova.network.neutron [-] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.095 2 INFO nova.compute.manager [-] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Took 1.70 seconds to deallocate network for instance.
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.173 2 DEBUG oslo_concurrency.lockutils [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.174 2 DEBUG oslo_concurrency.lockutils [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.191 2 DEBUG nova.compute.manager [req-b1ec6735-9aa1-45f1-b4b4-8fe5f1508726 req-7857f6d4-e640-4a15-9d79-d296a0c4b1e1 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Received event network-vif-deleted-6b981388-21bd-4946-8422-9a7fcd19cd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.238 2 DEBUG nova.compute.provider_tree [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.253 2 DEBUG nova.scheduler.client.report [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.286 2 DEBUG oslo_concurrency.lockutils [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.327 2 INFO nova.scheduler.client.report [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Deleted allocations for instance ad53b0ad-a45d-429f-9f25-0da979224b83
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.413 2 DEBUG oslo_concurrency.lockutils [None req-11db3bd3-cf89-4061-8890-85b5f50eed29 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:04 compute-0 nova_compute[192567]: 2025-10-02 08:50:04.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:05 compute-0 nova_compute[192567]: 2025-10-02 08:50:05.220 2 DEBUG nova.compute.manager [req-6bae6738-d79a-4351-839b-cce422b99831 req-7dc19692-44c6-4aa6-becb-940f2f0baaf8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Received event network-vif-plugged-6b981388-21bd-4946-8422-9a7fcd19cd4a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:50:05 compute-0 nova_compute[192567]: 2025-10-02 08:50:05.220 2 DEBUG oslo_concurrency.lockutils [req-6bae6738-d79a-4351-839b-cce422b99831 req-7dc19692-44c6-4aa6-becb-940f2f0baaf8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:05 compute-0 nova_compute[192567]: 2025-10-02 08:50:05.221 2 DEBUG oslo_concurrency.lockutils [req-6bae6738-d79a-4351-839b-cce422b99831 req-7dc19692-44c6-4aa6-becb-940f2f0baaf8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:05 compute-0 nova_compute[192567]: 2025-10-02 08:50:05.221 2 DEBUG oslo_concurrency.lockutils [req-6bae6738-d79a-4351-839b-cce422b99831 req-7dc19692-44c6-4aa6-becb-940f2f0baaf8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "ad53b0ad-a45d-429f-9f25-0da979224b83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:05 compute-0 nova_compute[192567]: 2025-10-02 08:50:05.222 2 DEBUG nova.compute.manager [req-6bae6738-d79a-4351-839b-cce422b99831 req-7dc19692-44c6-4aa6-becb-940f2f0baaf8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] No waiting events found dispatching network-vif-plugged-6b981388-21bd-4946-8422-9a7fcd19cd4a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:50:05 compute-0 nova_compute[192567]: 2025-10-02 08:50:05.222 2 WARNING nova.compute.manager [req-6bae6738-d79a-4351-839b-cce422b99831 req-7dc19692-44c6-4aa6-becb-940f2f0baaf8 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Received unexpected event network-vif-plugged-6b981388-21bd-4946-8422-9a7fcd19cd4a for instance with vm_state deleted and task_state None.
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.172 2 DEBUG oslo_concurrency.lockutils [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "1cb5175a-2ebc-4a24-a337-18818506b843" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.173 2 DEBUG oslo_concurrency.lockutils [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "1cb5175a-2ebc-4a24-a337-18818506b843" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.174 2 DEBUG oslo_concurrency.lockutils [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "1cb5175a-2ebc-4a24-a337-18818506b843-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.174 2 DEBUG oslo_concurrency.lockutils [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "1cb5175a-2ebc-4a24-a337-18818506b843-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.175 2 DEBUG oslo_concurrency.lockutils [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "1cb5175a-2ebc-4a24-a337-18818506b843-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.177 2 INFO nova.compute.manager [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Terminating instance
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.179 2 DEBUG nova.compute.manager [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 02 08:50:06 compute-0 kernel: tapfabf06ed-2e (unregistering): left promiscuous mode
Oct 02 08:50:06 compute-0 NetworkManager[51654]: <info>  [1759395006.2109] device (tapfabf06ed-2e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:06 compute-0 ovn_controller[94821]: 2025-10-02T08:50:06Z|00269|binding|INFO|Releasing lport fabf06ed-2e57-414c-a22a-c1211852e0e0 from this chassis (sb_readonly=0)
Oct 02 08:50:06 compute-0 ovn_controller[94821]: 2025-10-02T08:50:06Z|00270|binding|INFO|Setting lport fabf06ed-2e57-414c-a22a-c1211852e0e0 down in Southbound
Oct 02 08:50:06 compute-0 ovn_controller[94821]: 2025-10-02T08:50:06Z|00271|binding|INFO|Removing iface tapfabf06ed-2e ovn-installed in OVS
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.239 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:ad:bc 10.100.0.11'], port_security=['fa:16:3e:93:ad:bc 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1cb5175a-2ebc-4a24-a337-18818506b843', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ac74327999c44d3eb46d6c8280531147', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'ed8f6343-b875-4638-92b8-f2446903f322', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a95bbfad-5b25-4f75-9394-85dc68a3a437, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>], logical_port=fabf06ed-2e57-414c-a22a-c1211852e0e0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fd4223df700>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.240 103703 INFO neutron.agent.ovn.metadata.agent [-] Port fabf06ed-2e57-414c-a22a-c1211852e0e0 in datapath dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d unbound from our chassis
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.242 103703 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.244 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[3282978a-abfb-4c91-96dc-76d1d8389425]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.245 103703 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d namespace which is not needed anymore
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:06 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000025.scope: Deactivated successfully.
Oct 02 08:50:06 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000025.scope: Consumed 3.178s CPU time.
Oct 02 08:50:06 compute-0 systemd-machined[152597]: Machine qemu-25-instance-00000025 terminated.
Oct 02 08:50:06 compute-0 neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d[229609]: [NOTICE]   (229613) : haproxy version is 2.8.14-c23fe91
Oct 02 08:50:06 compute-0 neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d[229609]: [NOTICE]   (229613) : path to executable is /usr/sbin/haproxy
Oct 02 08:50:06 compute-0 neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d[229609]: [WARNING]  (229613) : Exiting Master process...
Oct 02 08:50:06 compute-0 neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d[229609]: [ALERT]    (229613) : Current worker (229615) exited with code 143 (Terminated)
Oct 02 08:50:06 compute-0 neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d[229609]: [WARNING]  (229613) : All workers exited. Exiting... (0)
Oct 02 08:50:06 compute-0 systemd[1]: libpod-c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383.scope: Deactivated successfully.
Oct 02 08:50:06 compute-0 podman[230295]: 2025-10-02 08:50:06.441690726 +0000 UTC m=+0.076820013 container died c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.469 2 INFO nova.virt.libvirt.driver [-] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Instance destroyed successfully.
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.469 2 DEBUG nova.objects.instance [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lazy-loading 'resources' on Instance uuid 1cb5175a-2ebc-4a24-a337-18818506b843 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 02 08:50:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383-userdata-shm.mount: Deactivated successfully.
Oct 02 08:50:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-cc8eee427fe440cede31350228bc85150247d6bb7d58f66f98ba150f948b5a90-merged.mount: Deactivated successfully.
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.485 2 DEBUG nova.virt.libvirt.vif [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-02T08:47:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1681371285',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1681371285',id=37,image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-02T08:48:02Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ac74327999c44d3eb46d6c8280531147',ramdisk_id='',reservation_id='r-t6vpxvgl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',clean_attempts='1',image_base_image_ref='f5cf0efc-6f3c-4865-b002-490e9c9b250d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1420949844',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1420949844-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-02T08:49:57Z,user_data=None,user_id='796383ba775b4e63982ab22e1ab7e3e4',uuid=1cb5175a-2ebc-4a24-a337-18818506b843,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "address": "fa:16:3e:93:ad:bc", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfabf06ed-2e", "ovs_interfaceid": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.486 2 DEBUG nova.network.os_vif_util [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Converting VIF {"id": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "address": "fa:16:3e:93:ad:bc", "network": {"id": "dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-437715312-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "22b3ce61c450411ca6668f5f26f006c4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfabf06ed-2e", "ovs_interfaceid": "fabf06ed-2e57-414c-a22a-c1211852e0e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.487 2 DEBUG nova.network.os_vif_util [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:93:ad:bc,bridge_name='br-int',has_traffic_filtering=True,id=fabf06ed-2e57-414c-a22a-c1211852e0e0,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfabf06ed-2e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.487 2 DEBUG os_vif [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:ad:bc,bridge_name='br-int',has_traffic_filtering=True,id=fabf06ed-2e57-414c-a22a-c1211852e0e0,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfabf06ed-2e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:06 compute-0 podman[230295]: 2025-10-02 08:50:06.489898216 +0000 UTC m=+0.125027493 container cleanup c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.490 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfabf06ed-2e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.501 2 INFO os_vif [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:ad:bc,bridge_name='br-int',has_traffic_filtering=True,id=fabf06ed-2e57-414c-a22a-c1211852e0e0,network=Network(dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfabf06ed-2e')
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.502 2 INFO nova.virt.libvirt.driver [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Deleting instance files /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843_del
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.503 2 INFO nova.virt.libvirt.driver [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Deletion of /var/lib/nova/instances/1cb5175a-2ebc-4a24-a337-18818506b843_del complete
Oct 02 08:50:06 compute-0 systemd[1]: libpod-conmon-c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383.scope: Deactivated successfully.
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.561 2 INFO nova.compute.manager [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Took 0.38 seconds to destroy the instance on the hypervisor.
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.562 2 DEBUG oslo.service.loopingcall [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.563 2 DEBUG nova.compute.manager [-] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.564 2 DEBUG nova.network.neutron [-] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 02 08:50:06 compute-0 podman[230342]: 2025-10-02 08:50:06.575099479 +0000 UTC m=+0.056850400 container remove c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.585 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[199f38e9-447b-42c1-9736-09f4da67edf2]: (4, ('Thu Oct  2 08:50:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d (c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383)\nc42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383\nThu Oct  2 08:50:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d (c42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383)\nc42e9fab3b6bb54cf6e327bde04d5b2e3fcb36b8731bcd80671e0aabb0636383\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.587 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[565bb3b9-382f-4f51-b879-a4f6ee512395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.588 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdfb4a911-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:06 compute-0 kernel: tapdfb4a911-c0: left promiscuous mode
Oct 02 08:50:06 compute-0 nova_compute[192567]: 2025-10-02 08:50:06.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.607 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa86dbc-ae56-4ca7-aed1-57ac26132585]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.646 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[d352f2d4-8818-45a2-9fee-2cd1201c9a85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.648 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[5333a681-2348-4cbb-8085-e3d7d2c2c531]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.665 215188 DEBUG oslo.privsep.daemon [-] privsep: reply[670d5ec6-6a43-471a-8aee-93dd9c9a7d7b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 578367, 'reachable_time': 17874, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230357, 'error': None, 'target': 'ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:06 compute-0 systemd[1]: run-netns-ovnmeta\x2ddfb4a911\x2dc6ae\x2d4cd3\x2db8be\x2dc1dc0fbeaf7d.mount: Deactivated successfully.
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.671 103814 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dfb4a911-c6ae-4cd3-b8be-c1dc0fbeaf7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 02 08:50:06 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:06.672 103814 DEBUG oslo.privsep.daemon [-] privsep: reply[07072745-e8e9-45ec-9492-4bbd4e4c274a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.303 2 DEBUG nova.compute.manager [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Received event network-vif-unplugged-fabf06ed-2e57-414c-a22a-c1211852e0e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.303 2 DEBUG oslo_concurrency.lockutils [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "1cb5175a-2ebc-4a24-a337-18818506b843-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.304 2 DEBUG oslo_concurrency.lockutils [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "1cb5175a-2ebc-4a24-a337-18818506b843-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.304 2 DEBUG oslo_concurrency.lockutils [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "1cb5175a-2ebc-4a24-a337-18818506b843-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.304 2 DEBUG nova.compute.manager [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] No waiting events found dispatching network-vif-unplugged-fabf06ed-2e57-414c-a22a-c1211852e0e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.305 2 DEBUG nova.compute.manager [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Received event network-vif-unplugged-fabf06ed-2e57-414c-a22a-c1211852e0e0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.305 2 DEBUG nova.compute.manager [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Received event network-vif-plugged-fabf06ed-2e57-414c-a22a-c1211852e0e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.305 2 DEBUG oslo_concurrency.lockutils [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Acquiring lock "1cb5175a-2ebc-4a24-a337-18818506b843-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.306 2 DEBUG oslo_concurrency.lockutils [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "1cb5175a-2ebc-4a24-a337-18818506b843-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.306 2 DEBUG oslo_concurrency.lockutils [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] Lock "1cb5175a-2ebc-4a24-a337-18818506b843-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.306 2 DEBUG nova.compute.manager [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] No waiting events found dispatching network-vif-plugged-fabf06ed-2e57-414c-a22a-c1211852e0e0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.307 2 WARNING nova.compute.manager [req-db05329c-0730-4fff-9ac2-75ef85ae9aeb req-25e715b7-e9d6-40ce-8d1c-b924d06e4981 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Received unexpected event network-vif-plugged-fabf06ed-2e57-414c-a22a-c1211852e0e0 for instance with vm_state active and task_state deleting.
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.385 2 DEBUG nova.network.neutron [-] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.403 2 INFO nova.compute.manager [-] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Took 0.84 seconds to deallocate network for instance.
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.463 2 DEBUG oslo_concurrency.lockutils [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.463 2 DEBUG oslo_concurrency.lockutils [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.509 2 DEBUG nova.compute.provider_tree [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.524 2 DEBUG nova.scheduler.client.report [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.548 2 DEBUG oslo_concurrency.lockutils [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.576 2 INFO nova.scheduler.client.report [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Deleted allocations for instance 1cb5175a-2ebc-4a24-a337-18818506b843
Oct 02 08:50:07 compute-0 nova_compute[192567]: 2025-10-02 08:50:07.648 2 DEBUG oslo_concurrency.lockutils [None req-480489bc-f176-4595-845a-78091b1332fd 796383ba775b4e63982ab22e1ab7e3e4 ac74327999c44d3eb46d6c8280531147 - - default default] Lock "1cb5175a-2ebc-4a24-a337-18818506b843" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:08 compute-0 podman[230358]: 2025-10-02 08:50:08.146192183 +0000 UTC m=+0.063831799 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, name=ubi9-minimal)
Oct 02 08:50:08 compute-0 nova_compute[192567]: 2025-10-02 08:50:08.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:50:09 compute-0 nova_compute[192567]: 2025-10-02 08:50:09.403 2 DEBUG nova.compute.manager [req-a0df4589-771e-4242-9ceb-17a7d757fccc req-ae76d270-6c34-4c76-9d89-2a79ab05a631 0118a72dc3454fa389f825c36dbcb57d 353691e3063942de83ad2d05649cd1f3 - - default default] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Received event network-vif-deleted-fabf06ed-2e57-414c-a22a-c1211852e0e0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 02 08:50:09 compute-0 nova_compute[192567]: 2025-10-02 08:50:09.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:09 compute-0 nova_compute[192567]: 2025-10-02 08:50:09.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:50:09 compute-0 nova_compute[192567]: 2025-10-02 08:50:09.627 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:50:10 compute-0 nova_compute[192567]: 2025-10-02 08:50:10.629 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:50:11 compute-0 nova_compute[192567]: 2025-10-02 08:50:11.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:11 compute-0 nova_compute[192567]: 2025-10-02 08:50:11.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:50:14 compute-0 nova_compute[192567]: 2025-10-02 08:50:14.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:16 compute-0 sshd-session[230379]: Invalid user config from 128.185.228.134 port 56654
Oct 02 08:50:16 compute-0 sshd-session[230379]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 08:50:16 compute-0 sshd-session[230379]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=128.185.228.134
Oct 02 08:50:16 compute-0 nova_compute[192567]: 2025-10-02 08:50:16.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:17 compute-0 nova_compute[192567]: 2025-10-02 08:50:17.262 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759395002.2580595, ad53b0ad-a45d-429f-9f25-0da979224b83 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:50:17 compute-0 nova_compute[192567]: 2025-10-02 08:50:17.262 2 INFO nova.compute.manager [-] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] VM Stopped (Lifecycle Event)
Oct 02 08:50:17 compute-0 nova_compute[192567]: 2025-10-02 08:50:17.297 2 DEBUG nova.compute.manager [None req-1cbc52b9-2918-479a-b886-1fea6ff70faa - - - - - -] [instance: ad53b0ad-a45d-429f-9f25-0da979224b83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:50:18 compute-0 sshd-session[230379]: Failed password for invalid user config from 128.185.228.134 port 56654 ssh2
Oct 02 08:50:19 compute-0 nova_compute[192567]: 2025-10-02 08:50:19.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:20 compute-0 sshd-session[230379]: Connection closed by invalid user config 128.185.228.134 port 56654 [preauth]
Oct 02 08:50:21 compute-0 nova_compute[192567]: 2025-10-02 08:50:21.467 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759395006.46549, 1cb5175a-2ebc-4a24-a337-18818506b843 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 02 08:50:21 compute-0 nova_compute[192567]: 2025-10-02 08:50:21.468 2 INFO nova.compute.manager [-] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] VM Stopped (Lifecycle Event)
Oct 02 08:50:21 compute-0 nova_compute[192567]: 2025-10-02 08:50:21.513 2 DEBUG nova.compute.manager [None req-37c42250-fab1-4751-b0ce-748fc0bb7467 - - - - - -] [instance: 1cb5175a-2ebc-4a24-a337-18818506b843] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 02 08:50:21 compute-0 nova_compute[192567]: 2025-10-02 08:50:21.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:23 compute-0 podman[230384]: 2025-10-02 08:50:23.164336125 +0000 UTC m=+0.069971459 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20251001)
Oct 02 08:50:23 compute-0 podman[230381]: 2025-10-02 08:50:23.173545821 +0000 UTC m=+0.091833950 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:50:23 compute-0 podman[230383]: 2025-10-02 08:50:23.206871699 +0000 UTC m=+0.102402539 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:50:23 compute-0 podman[230382]: 2025-10-02 08:50:23.225700865 +0000 UTC m=+0.129714369 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Oct 02 08:50:24 compute-0 nova_compute[192567]: 2025-10-02 08:50:24.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:26 compute-0 nova_compute[192567]: 2025-10-02 08:50:26.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:27 compute-0 nova_compute[192567]: 2025-10-02 08:50:27.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:29 compute-0 podman[230460]: 2025-10-02 08:50:29.186795002 +0000 UTC m=+0.088443864 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:50:29 compute-0 nova_compute[192567]: 2025-10-02 08:50:29.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:29 compute-0 podman[203011]: time="2025-10-02T08:50:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:50:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:50:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:50:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:50:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Oct 02 08:50:31 compute-0 openstack_network_exporter[205118]: ERROR   08:50:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:50:31 compute-0 openstack_network_exporter[205118]: ERROR   08:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:50:31 compute-0 openstack_network_exporter[205118]: ERROR   08:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:50:31 compute-0 openstack_network_exporter[205118]: ERROR   08:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:50:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:50:31 compute-0 openstack_network_exporter[205118]: ERROR   08:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:50:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:50:31 compute-0 nova_compute[192567]: 2025-10-02 08:50:31.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:34 compute-0 nova_compute[192567]: 2025-10-02 08:50:34.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:36 compute-0 nova_compute[192567]: 2025-10-02 08:50:36.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:39 compute-0 podman[230484]: 2025-10-02 08:50:39.185127672 +0000 UTC m=+0.098181627 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:50:39 compute-0 nova_compute[192567]: 2025-10-02 08:50:39.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:41 compute-0 nova_compute[192567]: 2025-10-02 08:50:41.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:44 compute-0 nova_compute[192567]: 2025-10-02 08:50:44.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:46.016 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:50:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:46.017 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:50:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:50:46.017 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:50:46 compute-0 nova_compute[192567]: 2025-10-02 08:50:46.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:49 compute-0 nova_compute[192567]: 2025-10-02 08:50:49.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:51 compute-0 nova_compute[192567]: 2025-10-02 08:50:51.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:54 compute-0 podman[230508]: 2025-10-02 08:50:54.181044982 +0000 UTC m=+0.074600013 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct 02 08:50:54 compute-0 podman[230505]: 2025-10-02 08:50:54.188423712 +0000 UTC m=+0.093794291 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:50:54 compute-0 podman[230507]: 2025-10-02 08:50:54.20797345 +0000 UTC m=+0.098418285 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, io.buildah.version=1.41.3)
Oct 02 08:50:54 compute-0 podman[230506]: 2025-10-02 08:50:54.250904747 +0000 UTC m=+0.151408615 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 02 08:50:54 compute-0 nova_compute[192567]: 2025-10-02 08:50:54.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:56 compute-0 nova_compute[192567]: 2025-10-02 08:50:56.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:56 compute-0 nova_compute[192567]: 2025-10-02 08:50:56.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:50:56 compute-0 nova_compute[192567]: 2025-10-02 08:50:56.623 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:50:56 compute-0 nova_compute[192567]: 2025-10-02 08:50:56.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:50:56 compute-0 nova_compute[192567]: 2025-10-02 08:50:56.649 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:50:59 compute-0 podman[203011]: time="2025-10-02T08:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:50:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:50:59 compute-0 nova_compute[192567]: 2025-10-02 08:50:59.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:50:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Oct 02 08:50:59 compute-0 podman[230588]: 2025-10-02 08:50:59.868042316 +0000 UTC m=+0.078782164 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:51:01 compute-0 openstack_network_exporter[205118]: ERROR   08:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:51:01 compute-0 openstack_network_exporter[205118]: ERROR   08:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:51:01 compute-0 openstack_network_exporter[205118]: ERROR   08:51:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:51:01 compute-0 openstack_network_exporter[205118]: ERROR   08:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:51:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:51:01 compute-0 openstack_network_exporter[205118]: ERROR   08:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:51:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.665 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.665 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.666 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.666 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.924 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.925 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5856MB free_disk=73.46154022216797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.926 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.926 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.977 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.977 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:51:01 compute-0 nova_compute[192567]: 2025-10-02 08:51:01.997 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:51:02 compute-0 nova_compute[192567]: 2025-10-02 08:51:02.024 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:51:02 compute-0 nova_compute[192567]: 2025-10-02 08:51:02.024 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:51:02 compute-0 nova_compute[192567]: 2025-10-02 08:51:02.043 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:51:02 compute-0 nova_compute[192567]: 2025-10-02 08:51:02.063 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:51:02 compute-0 nova_compute[192567]: 2025-10-02 08:51:02.086 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:51:02 compute-0 nova_compute[192567]: 2025-10-02 08:51:02.104 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:51:02 compute-0 nova_compute[192567]: 2025-10-02 08:51:02.132 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:51:02 compute-0 nova_compute[192567]: 2025-10-02 08:51:02.133 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:51:03 compute-0 nova_compute[192567]: 2025-10-02 08:51:03.133 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:51:03 compute-0 nova_compute[192567]: 2025-10-02 08:51:03.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:51:03 compute-0 nova_compute[192567]: 2025-10-02 08:51:03.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:51:04 compute-0 nova_compute[192567]: 2025-10-02 08:51:04.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:06 compute-0 nova_compute[192567]: 2025-10-02 08:51:06.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:08 compute-0 ovn_controller[94821]: 2025-10-02T08:51:08Z|00272|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 02 08:51:08 compute-0 nova_compute[192567]: 2025-10-02 08:51:08.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:51:09 compute-0 nova_compute[192567]: 2025-10-02 08:51:09.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:10 compute-0 podman[230612]: 2025-10-02 08:51:10.183346574 +0000 UTC m=+0.098051293 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, config_id=edpm, build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 02 08:51:11 compute-0 nova_compute[192567]: 2025-10-02 08:51:11.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:11 compute-0 nova_compute[192567]: 2025-10-02 08:51:11.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:51:11 compute-0 nova_compute[192567]: 2025-10-02 08:51:11.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:51:11 compute-0 nova_compute[192567]: 2025-10-02 08:51:11.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:51:14 compute-0 nova_compute[192567]: 2025-10-02 08:51:14.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:16 compute-0 nova_compute[192567]: 2025-10-02 08:51:16.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:19 compute-0 nova_compute[192567]: 2025-10-02 08:51:19.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:21 compute-0 nova_compute[192567]: 2025-10-02 08:51:21.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:24 compute-0 nova_compute[192567]: 2025-10-02 08:51:24.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:25 compute-0 podman[230633]: 2025-10-02 08:51:25.17399601 +0000 UTC m=+0.085357639 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:51:25 compute-0 podman[230640]: 2025-10-02 08:51:25.18299331 +0000 UTC m=+0.077953958 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:51:25 compute-0 podman[230641]: 2025-10-02 08:51:25.205113559 +0000 UTC m=+0.089625081 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid)
Oct 02 08:51:25 compute-0 podman[230634]: 2025-10-02 08:51:25.250313716 +0000 UTC m=+0.150863017 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:51:26 compute-0 nova_compute[192567]: 2025-10-02 08:51:26.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:29 compute-0 sshd-session[230714]: Accepted publickey for zuul from 192.168.122.10 port 43240 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 08:51:29 compute-0 systemd-logind[827]: New session 43 of user zuul.
Oct 02 08:51:29 compute-0 systemd[1]: Started Session 43 of User zuul.
Oct 02 08:51:29 compute-0 sshd-session[230714]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 08:51:29 compute-0 sudo[230718]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 02 08:51:29 compute-0 sudo[230718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:51:29 compute-0 podman[203011]: time="2025-10-02T08:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:51:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:51:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 02 08:51:29 compute-0 nova_compute[192567]: 2025-10-02 08:51:29.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:30 compute-0 podman[230752]: 2025-10-02 08:51:30.489683594 +0000 UTC m=+0.110086589 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:51:31 compute-0 openstack_network_exporter[205118]: ERROR   08:51:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:51:31 compute-0 openstack_network_exporter[205118]: ERROR   08:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:51:31 compute-0 openstack_network_exporter[205118]: ERROR   08:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:51:31 compute-0 openstack_network_exporter[205118]: ERROR   08:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:51:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:51:31 compute-0 openstack_network_exporter[205118]: ERROR   08:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:51:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:51:31 compute-0 nova_compute[192567]: 2025-10-02 08:51:31.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:34 compute-0 ovs-vsctl[230913]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 02 08:51:34 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 230742 (sos)
Oct 02 08:51:34 compute-0 nova_compute[192567]: 2025-10-02 08:51:34.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:34 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct 02 08:51:34 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct 02 08:51:35 compute-0 virtqemud[192112]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 02 08:51:35 compute-0 virtqemud[192112]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 02 08:51:35 compute-0 virtqemud[192112]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 02 08:51:36 compute-0 kernel: block sr0: the capability attribute has been deprecated.
Oct 02 08:51:36 compute-0 crontab[231341]: (root) LIST (root)
Oct 02 08:51:36 compute-0 nova_compute[192567]: 2025-10-02 08:51:36.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:38 compute-0 systemd[1]: Starting Hostname Service...
Oct 02 08:51:38 compute-0 systemd[1]: Started Hostname Service.
Oct 02 08:51:39 compute-0 nova_compute[192567]: 2025-10-02 08:51:39.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:40 compute-0 podman[231525]: 2025-10-02 08:51:40.42854285 +0000 UTC m=+0.077471361 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, 
Inc., release=1755695350, architecture=x86_64, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.6)
Oct 02 08:51:41 compute-0 nova_compute[192567]: 2025-10-02 08:51:41.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:44 compute-0 nova_compute[192567]: 2025-10-02 08:51:44.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:45 compute-0 ovs-appctl[232479]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 02 08:51:45 compute-0 ovs-appctl[232485]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 02 08:51:45 compute-0 ovs-appctl[232494]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 02 08:51:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:51:46.019 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:51:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:51:46.019 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:51:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:51:46.020 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:51:46 compute-0 nova_compute[192567]: 2025-10-02 08:51:46.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:49 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 02 08:51:49 compute-0 nova_compute[192567]: 2025-10-02 08:51:49.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:51 compute-0 nova_compute[192567]: 2025-10-02 08:51:51.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:53 compute-0 virtqemud[192112]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 02 08:51:54 compute-0 nova_compute[192567]: 2025-10-02 08:51:54.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:55 compute-0 podman[233867]: 2025-10-02 08:51:55.323999094 +0000 UTC m=+0.103970255 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 08:51:55 compute-0 podman[233869]: 2025-10-02 08:51:55.333009274 +0000 UTC m=+0.108930979 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 02 08:51:55 compute-0 podman[233876]: 2025-10-02 08:51:55.335036457 +0000 UTC m=+0.100187887 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true)
Oct 02 08:51:55 compute-0 podman[233905]: 2025-10-02 08:51:55.39234109 +0000 UTC m=+0.102491119 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Oct 02 08:51:56 compute-0 systemd[1]: Starting Time & Date Service...
Oct 02 08:51:56 compute-0 systemd[1]: Started Time & Date Service.
Oct 02 08:51:56 compute-0 nova_compute[192567]: 2025-10-02 08:51:56.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:51:57 compute-0 nova_compute[192567]: 2025-10-02 08:51:57.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:51:57 compute-0 nova_compute[192567]: 2025-10-02 08:51:57.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:51:57 compute-0 nova_compute[192567]: 2025-10-02 08:51:57.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:51:57 compute-0 nova_compute[192567]: 2025-10-02 08:51:57.654 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:51:59 compute-0 podman[203011]: time="2025-10-02T08:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:51:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:51:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Oct 02 08:51:59 compute-0 nova_compute[192567]: 2025-10-02 08:51:59.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:00 compute-0 podman[233986]: 2025-10-02 08:52:00.610330766 +0000 UTC m=+0.071645140 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:52:01 compute-0 openstack_network_exporter[205118]: ERROR   08:52:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:52:01 compute-0 openstack_network_exporter[205118]: ERROR   08:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:52:01 compute-0 openstack_network_exporter[205118]: ERROR   08:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:52:01 compute-0 openstack_network_exporter[205118]: ERROR   08:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:52:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:52:01 compute-0 openstack_network_exporter[205118]: ERROR   08:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:52:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:52:01 compute-0 nova_compute[192567]: 2025-10-02 08:52:01.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:02 compute-0 nova_compute[192567]: 2025-10-02 08:52:02.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:02 compute-0 nova_compute[192567]: 2025-10-02 08:52:02.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:52:02 compute-0 nova_compute[192567]: 2025-10-02 08:52:02.647 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.642 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.644 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.645 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.725 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.726 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.726 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.726 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.939 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.940 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5511MB free_disk=72.95986938476562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.941 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:52:03 compute-0 nova_compute[192567]: 2025-10-02 08:52:03.941 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:52:04 compute-0 nova_compute[192567]: 2025-10-02 08:52:04.060 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:52:04 compute-0 nova_compute[192567]: 2025-10-02 08:52:04.060 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:52:04 compute-0 nova_compute[192567]: 2025-10-02 08:52:04.078 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:52:04 compute-0 nova_compute[192567]: 2025-10-02 08:52:04.093 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:52:04 compute-0 nova_compute[192567]: 2025-10-02 08:52:04.110 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:52:04 compute-0 nova_compute[192567]: 2025-10-02 08:52:04.110 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:52:04 compute-0 nova_compute[192567]: 2025-10-02 08:52:04.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:05 compute-0 nova_compute[192567]: 2025-10-02 08:52:05.088 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:05 compute-0 nova_compute[192567]: 2025-10-02 08:52:05.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:06 compute-0 nova_compute[192567]: 2025-10-02 08:52:06.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:08 compute-0 nova_compute[192567]: 2025-10-02 08:52:08.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:09 compute-0 nova_compute[192567]: 2025-10-02 08:52:09.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:11 compute-0 podman[234014]: 2025-10-02 08:52:11.142667165 +0000 UTC m=+0.071320040 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Oct 02 08:52:11 compute-0 nova_compute[192567]: 2025-10-02 08:52:11.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:11 compute-0 nova_compute[192567]: 2025-10-02 08:52:11.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:12 compute-0 nova_compute[192567]: 2025-10-02 08:52:12.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:13 compute-0 nova_compute[192567]: 2025-10-02 08:52:13.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:13 compute-0 nova_compute[192567]: 2025-10-02 08:52:13.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:52:13 compute-0 nova_compute[192567]: 2025-10-02 08:52:13.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:13 compute-0 nova_compute[192567]: 2025-10-02 08:52:13.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:52:14 compute-0 nova_compute[192567]: 2025-10-02 08:52:14.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:16 compute-0 sudo[230718]: pam_unix(sudo:session): session closed for user root
Oct 02 08:52:16 compute-0 sshd-session[230717]: Received disconnect from 192.168.122.10 port 43240:11: disconnected by user
Oct 02 08:52:16 compute-0 sshd-session[230717]: Disconnected from user zuul 192.168.122.10 port 43240
Oct 02 08:52:16 compute-0 sshd-session[230714]: pam_unix(sshd:session): session closed for user zuul
Oct 02 08:52:16 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Oct 02 08:52:16 compute-0 systemd[1]: session-43.scope: Consumed 1min 19.826s CPU time, 501.5M memory peak, read 106.6M from disk, written 17.8M to disk.
Oct 02 08:52:16 compute-0 systemd-logind[827]: Session 43 logged out. Waiting for processes to exit.
Oct 02 08:52:16 compute-0 systemd-logind[827]: Removed session 43.
Oct 02 08:52:16 compute-0 sshd-session[234035]: Accepted publickey for zuul from 192.168.122.10 port 44230 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 08:52:16 compute-0 systemd-logind[827]: New session 44 of user zuul.
Oct 02 08:52:16 compute-0 systemd[1]: Started Session 44 of User zuul.
Oct 02 08:52:16 compute-0 sshd-session[234035]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 08:52:16 compute-0 sudo[234039]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-10-02-xeydfni.tar.xz
Oct 02 08:52:16 compute-0 sudo[234039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:52:16 compute-0 sudo[234039]: pam_unix(sudo:session): session closed for user root
Oct 02 08:52:16 compute-0 sshd-session[234038]: Received disconnect from 192.168.122.10 port 44230:11: disconnected by user
Oct 02 08:52:16 compute-0 sshd-session[234038]: Disconnected from user zuul 192.168.122.10 port 44230
Oct 02 08:52:16 compute-0 sshd-session[234035]: pam_unix(sshd:session): session closed for user zuul
Oct 02 08:52:16 compute-0 systemd-logind[827]: Session 44 logged out. Waiting for processes to exit.
Oct 02 08:52:16 compute-0 systemd[1]: session-44.scope: Deactivated successfully.
Oct 02 08:52:16 compute-0 systemd-logind[827]: Removed session 44.
Oct 02 08:52:16 compute-0 sshd-session[234064]: Accepted publickey for zuul from 192.168.122.10 port 44240 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 08:52:16 compute-0 systemd-logind[827]: New session 45 of user zuul.
Oct 02 08:52:16 compute-0 systemd[1]: Started Session 45 of User zuul.
Oct 02 08:52:16 compute-0 sshd-session[234064]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 08:52:16 compute-0 nova_compute[192567]: 2025-10-02 08:52:16.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:16 compute-0 sudo[234068]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Oct 02 08:52:16 compute-0 sudo[234068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:52:16 compute-0 sudo[234068]: pam_unix(sudo:session): session closed for user root
Oct 02 08:52:16 compute-0 sshd-session[234067]: Received disconnect from 192.168.122.10 port 44240:11: disconnected by user
Oct 02 08:52:16 compute-0 sshd-session[234067]: Disconnected from user zuul 192.168.122.10 port 44240
Oct 02 08:52:16 compute-0 sshd-session[234064]: pam_unix(sshd:session): session closed for user zuul
Oct 02 08:52:16 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Oct 02 08:52:16 compute-0 systemd-logind[827]: Session 45 logged out. Waiting for processes to exit.
Oct 02 08:52:16 compute-0 systemd-logind[827]: Removed session 45.
Oct 02 08:52:19 compute-0 nova_compute[192567]: 2025-10-02 08:52:19.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:21 compute-0 nova_compute[192567]: 2025-10-02 08:52:21.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:24 compute-0 nova_compute[192567]: 2025-10-02 08:52:24.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:26 compute-0 podman[234093]: 2025-10-02 08:52:26.190873159 +0000 UTC m=+0.094845602 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 02 08:52:26 compute-0 podman[234096]: 2025-10-02 08:52:26.203682877 +0000 UTC m=+0.099116915 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 02 08:52:26 compute-0 podman[234095]: 2025-10-02 08:52:26.201990004 +0000 UTC m=+0.101345603 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:52:26 compute-0 podman[234094]: 2025-10-02 08:52:26.21312266 +0000 UTC m=+0.124053369 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, org.label-schema.build-date=20251001)
Oct 02 08:52:26 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 02 08:52:26 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 02 08:52:26 compute-0 nova_compute[192567]: 2025-10-02 08:52:26.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:29 compute-0 podman[203011]: time="2025-10-02T08:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:52:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:52:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Oct 02 08:52:29 compute-0 nova_compute[192567]: 2025-10-02 08:52:29.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:31 compute-0 podman[234178]: 2025-10-02 08:52:31.195854708 +0000 UTC m=+0.098602189 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:52:31 compute-0 openstack_network_exporter[205118]: ERROR   08:52:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:52:31 compute-0 openstack_network_exporter[205118]: ERROR   08:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:52:31 compute-0 openstack_network_exporter[205118]: ERROR   08:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:52:31 compute-0 openstack_network_exporter[205118]: ERROR   08:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:52:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:52:31 compute-0 openstack_network_exporter[205118]: ERROR   08:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:52:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:52:31 compute-0 nova_compute[192567]: 2025-10-02 08:52:31.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:34 compute-0 nova_compute[192567]: 2025-10-02 08:52:34.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:36 compute-0 nova_compute[192567]: 2025-10-02 08:52:36.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:36 compute-0 nova_compute[192567]: 2025-10-02 08:52:36.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:39 compute-0 nova_compute[192567]: 2025-10-02 08:52:39.941 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:39 compute-0 nova_compute[192567]: 2025-10-02 08:52:39.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:41 compute-0 nova_compute[192567]: 2025-10-02 08:52:41.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:42 compute-0 podman[234202]: 2025-10-02 08:52:42.198867786 +0000 UTC m=+0.100071434 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7)
Oct 02 08:52:44 compute-0 nova_compute[192567]: 2025-10-02 08:52:44.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:52:46.020 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:52:46.021 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:52:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:52:46.021 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:52:46 compute-0 nova_compute[192567]: 2025-10-02 08:52:46.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:49 compute-0 nova_compute[192567]: 2025-10-02 08:52:49.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:51 compute-0 nova_compute[192567]: 2025-10-02 08:52:51.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:54 compute-0 nova_compute[192567]: 2025-10-02 08:52:54.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:56 compute-0 nova_compute[192567]: 2025-10-02 08:52:56.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:52:57 compute-0 podman[234225]: 2025-10-02 08:52:57.149586726 +0000 UTC m=+0.061722141 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 02 08:52:57 compute-0 podman[234226]: 2025-10-02 08:52:57.171517458 +0000 UTC m=+0.076096038 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3)
Oct 02 08:52:57 compute-0 podman[234223]: 2025-10-02 08:52:57.178630129 +0000 UTC m=+0.095758710 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 02 08:52:57 compute-0 podman[234224]: 2025-10-02 08:52:57.215860987 +0000 UTC m=+0.120362025 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:52:59 compute-0 nova_compute[192567]: 2025-10-02 08:52:59.653 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:52:59 compute-0 nova_compute[192567]: 2025-10-02 08:52:59.653 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:52:59 compute-0 nova_compute[192567]: 2025-10-02 08:52:59.653 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:52:59 compute-0 nova_compute[192567]: 2025-10-02 08:52:59.675 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:52:59 compute-0 podman[203011]: time="2025-10-02T08:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:52:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:52:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Oct 02 08:52:59 compute-0 nova_compute[192567]: 2025-10-02 08:52:59.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:01 compute-0 openstack_network_exporter[205118]: ERROR   08:53:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:53:01 compute-0 openstack_network_exporter[205118]: ERROR   08:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:53:01 compute-0 openstack_network_exporter[205118]: ERROR   08:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:53:01 compute-0 openstack_network_exporter[205118]: ERROR   08:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:53:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:53:01 compute-0 openstack_network_exporter[205118]: ERROR   08:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:53:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:53:01 compute-0 nova_compute[192567]: 2025-10-02 08:53:01.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:02 compute-0 podman[234308]: 2025-10-02 08:53:02.177448547 +0000 UTC m=+0.082106285 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.649 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.650 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.650 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.650 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.825 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.827 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5816MB free_disk=73.46125793457031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.827 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.827 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.897 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.897 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.949 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.964 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.993 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:53:03 compute-0 nova_compute[192567]: 2025-10-02 08:53:03.994 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:53:04 compute-0 nova_compute[192567]: 2025-10-02 08:53:04.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:04 compute-0 nova_compute[192567]: 2025-10-02 08:53:04.994 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:53:04 compute-0 nova_compute[192567]: 2025-10-02 08:53:04.994 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:53:05 compute-0 nova_compute[192567]: 2025-10-02 08:53:05.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:53:06 compute-0 nova_compute[192567]: 2025-10-02 08:53:06.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:53:06 compute-0 nova_compute[192567]: 2025-10-02 08:53:06.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:09 compute-0 nova_compute[192567]: 2025-10-02 08:53:09.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:10 compute-0 nova_compute[192567]: 2025-10-02 08:53:10.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:53:11 compute-0 nova_compute[192567]: 2025-10-02 08:53:11.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:13 compute-0 podman[234332]: 2025-10-02 08:53:13.150497664 +0000 UTC m=+0.067669206 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:53:13 compute-0 nova_compute[192567]: 2025-10-02 08:53:13.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:53:13 compute-0 nova_compute[192567]: 2025-10-02 08:53:13.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:53:13 compute-0 nova_compute[192567]: 2025-10-02 08:53:13.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:53:14 compute-0 nova_compute[192567]: 2025-10-02 08:53:14.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:16 compute-0 nova_compute[192567]: 2025-10-02 08:53:16.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:19 compute-0 nova_compute[192567]: 2025-10-02 08:53:19.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:21 compute-0 nova_compute[192567]: 2025-10-02 08:53:21.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:24 compute-0 nova_compute[192567]: 2025-10-02 08:53:24.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:26 compute-0 nova_compute[192567]: 2025-10-02 08:53:26.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:28 compute-0 podman[234358]: 2025-10-02 08:53:28.164767974 +0000 UTC m=+0.074887541 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 02 08:53:28 compute-0 podman[234355]: 2025-10-02 08:53:28.17300621 +0000 UTC m=+0.083210339 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 02 08:53:28 compute-0 podman[234357]: 2025-10-02 08:53:28.190025719 +0000 UTC m=+0.093837119 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct 02 08:53:28 compute-0 podman[234356]: 2025-10-02 08:53:28.254458923 +0000 UTC m=+0.162115843 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:53:29 compute-0 podman[203011]: time="2025-10-02T08:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:53:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:53:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Oct 02 08:53:29 compute-0 nova_compute[192567]: 2025-10-02 08:53:29.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:31 compute-0 openstack_network_exporter[205118]: ERROR   08:53:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:53:31 compute-0 openstack_network_exporter[205118]: ERROR   08:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:53:31 compute-0 openstack_network_exporter[205118]: ERROR   08:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:53:31 compute-0 openstack_network_exporter[205118]: ERROR   08:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:53:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:53:31 compute-0 openstack_network_exporter[205118]: ERROR   08:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:53:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:53:31 compute-0 nova_compute[192567]: 2025-10-02 08:53:31.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:33 compute-0 podman[234436]: 2025-10-02 08:53:33.166903974 +0000 UTC m=+0.078084369 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:53:34 compute-0 nova_compute[192567]: 2025-10-02 08:53:34.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:36 compute-0 nova_compute[192567]: 2025-10-02 08:53:36.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:40 compute-0 nova_compute[192567]: 2025-10-02 08:53:40.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:41 compute-0 nova_compute[192567]: 2025-10-02 08:53:41.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:44 compute-0 podman[234461]: 2025-10-02 08:53:44.171560645 +0000 UTC m=+0.085105478 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 02 08:53:45 compute-0 nova_compute[192567]: 2025-10-02 08:53:45.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:53:46.022 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:53:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:53:46.022 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:53:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:53:46.022 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:53:46 compute-0 nova_compute[192567]: 2025-10-02 08:53:46.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:50 compute-0 nova_compute[192567]: 2025-10-02 08:53:50.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:51 compute-0 nova_compute[192567]: 2025-10-02 08:53:51.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:55 compute-0 nova_compute[192567]: 2025-10-02 08:53:55.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:56 compute-0 nova_compute[192567]: 2025-10-02 08:53:56.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:53:59 compute-0 podman[234485]: 2025-10-02 08:53:59.165342887 +0000 UTC m=+0.064989892 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:53:59 compute-0 podman[234482]: 2025-10-02 08:53:59.175014888 +0000 UTC m=+0.085977215 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:53:59 compute-0 podman[234484]: 2025-10-02 08:53:59.190357566 +0000 UTC m=+0.091891160 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 02 08:53:59 compute-0 podman[234483]: 2025-10-02 08:53:59.210867573 +0000 UTC m=+0.106835564 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Oct 02 08:53:59 compute-0 podman[203011]: time="2025-10-02T08:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:53:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:53:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Oct 02 08:54:00 compute-0 nova_compute[192567]: 2025-10-02 08:54:00.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:00 compute-0 nova_compute[192567]: 2025-10-02 08:54:00.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:54:00 compute-0 nova_compute[192567]: 2025-10-02 08:54:00.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:54:00 compute-0 nova_compute[192567]: 2025-10-02 08:54:00.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:54:00 compute-0 nova_compute[192567]: 2025-10-02 08:54:00.644 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:54:01 compute-0 openstack_network_exporter[205118]: ERROR   08:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:54:01 compute-0 openstack_network_exporter[205118]: ERROR   08:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:54:01 compute-0 openstack_network_exporter[205118]: ERROR   08:54:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:54:01 compute-0 openstack_network_exporter[205118]: ERROR   08:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:54:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:54:01 compute-0 openstack_network_exporter[205118]: ERROR   08:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:54:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:54:01 compute-0 nova_compute[192567]: 2025-10-02 08:54:01.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:04 compute-0 podman[234562]: 2025-10-02 08:54:04.157139667 +0000 UTC m=+0.076572082 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.648 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.649 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.650 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.650 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.888 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.890 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5828MB free_disk=73.46125793457031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.890 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:54:04 compute-0 nova_compute[192567]: 2025-10-02 08:54:04.890 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:54:05 compute-0 nova_compute[192567]: 2025-10-02 08:54:05.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:05 compute-0 nova_compute[192567]: 2025-10-02 08:54:05.109 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:54:05 compute-0 nova_compute[192567]: 2025-10-02 08:54:05.109 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:54:05 compute-0 nova_compute[192567]: 2025-10-02 08:54:05.195 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:54:05 compute-0 nova_compute[192567]: 2025-10-02 08:54:05.212 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:54:05 compute-0 nova_compute[192567]: 2025-10-02 08:54:05.214 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:54:05 compute-0 nova_compute[192567]: 2025-10-02 08:54:05.214 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:54:06 compute-0 nova_compute[192567]: 2025-10-02 08:54:06.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:07 compute-0 nova_compute[192567]: 2025-10-02 08:54:07.213 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:54:07 compute-0 nova_compute[192567]: 2025-10-02 08:54:07.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:54:10 compute-0 nova_compute[192567]: 2025-10-02 08:54:10.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:10 compute-0 nova_compute[192567]: 2025-10-02 08:54:10.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:54:11 compute-0 nova_compute[192567]: 2025-10-02 08:54:11.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:13 compute-0 nova_compute[192567]: 2025-10-02 08:54:13.619 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:54:15 compute-0 unix_chkpwd[234588]: password check failed for user (root)
Oct 02 08:54:15 compute-0 sshd-session[234586]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 02 08:54:15 compute-0 nova_compute[192567]: 2025-10-02 08:54:15.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:15 compute-0 podman[234589]: 2025-10-02 08:54:15.18279389 +0000 UTC m=+0.095383698 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Oct 02 08:54:15 compute-0 nova_compute[192567]: 2025-10-02 08:54:15.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:54:15 compute-0 nova_compute[192567]: 2025-10-02 08:54:15.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:54:15 compute-0 nova_compute[192567]: 2025-10-02 08:54:15.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:54:16 compute-0 sshd-session[234586]: Failed password for root from 80.94.93.119 port 60588 ssh2
Oct 02 08:54:16 compute-0 nova_compute[192567]: 2025-10-02 08:54:16.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:17 compute-0 unix_chkpwd[234610]: password check failed for user (root)
Oct 02 08:54:19 compute-0 sshd-session[234586]: Failed password for root from 80.94.93.119 port 60588 ssh2
Oct 02 08:54:19 compute-0 unix_chkpwd[234611]: password check failed for user (root)
Oct 02 08:54:20 compute-0 nova_compute[192567]: 2025-10-02 08:54:20.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:21 compute-0 sshd-session[234586]: Failed password for root from 80.94.93.119 port 60588 ssh2
Oct 02 08:54:21 compute-0 sshd-session[234586]: Received disconnect from 80.94.93.119 port 60588:11:  [preauth]
Oct 02 08:54:21 compute-0 sshd-session[234586]: Disconnected from authenticating user root 80.94.93.119 port 60588 [preauth]
Oct 02 08:54:21 compute-0 sshd-session[234586]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 02 08:54:21 compute-0 nova_compute[192567]: 2025-10-02 08:54:21.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:22 compute-0 unix_chkpwd[234614]: password check failed for user (root)
Oct 02 08:54:22 compute-0 sshd-session[234612]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 02 08:54:24 compute-0 sshd-session[234612]: Failed password for root from 80.94.93.119 port 60616 ssh2
Oct 02 08:54:24 compute-0 unix_chkpwd[234615]: password check failed for user (root)
Oct 02 08:54:25 compute-0 nova_compute[192567]: 2025-10-02 08:54:25.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:26 compute-0 sshd-session[234612]: Failed password for root from 80.94.93.119 port 60616 ssh2
Oct 02 08:54:26 compute-0 nova_compute[192567]: 2025-10-02 08:54:26.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:28 compute-0 unix_chkpwd[234616]: password check failed for user (root)
Oct 02 08:54:29 compute-0 podman[203011]: time="2025-10-02T08:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:54:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:54:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 02 08:54:30 compute-0 nova_compute[192567]: 2025-10-02 08:54:30.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:30 compute-0 podman[234617]: 2025-10-02 08:54:30.183369704 +0000 UTC m=+0.097981048 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, 
tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:54:30 compute-0 podman[234625]: 2025-10-02 08:54:30.198476535 +0000 UTC m=+0.086216694 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid)
Oct 02 08:54:30 compute-0 podman[234619]: 2025-10-02 08:54:30.209719593 +0000 UTC m=+0.101448796 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct 02 08:54:30 compute-0 podman[234618]: 2025-10-02 08:54:30.227572289 +0000 UTC m=+0.127206888 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 02 08:54:30 compute-0 sshd-session[234612]: Failed password for root from 80.94.93.119 port 60616 ssh2
Oct 02 08:54:30 compute-0 sshd-session[234612]: Received disconnect from 80.94.93.119 port 60616:11:  [preauth]
Oct 02 08:54:30 compute-0 sshd-session[234612]: Disconnected from authenticating user root 80.94.93.119 port 60616 [preauth]
Oct 02 08:54:30 compute-0 sshd-session[234612]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 02 08:54:31 compute-0 openstack_network_exporter[205118]: ERROR   08:54:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:54:31 compute-0 openstack_network_exporter[205118]: ERROR   08:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:54:31 compute-0 openstack_network_exporter[205118]: ERROR   08:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:54:31 compute-0 openstack_network_exporter[205118]: ERROR   08:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:54:31 compute-0 openstack_network_exporter[205118]: ERROR   08:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:54:31 compute-0 unix_chkpwd[234696]: password check failed for user (root)
Oct 02 08:54:31 compute-0 sshd-session[234694]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 02 08:54:31 compute-0 nova_compute[192567]: 2025-10-02 08:54:31.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:33 compute-0 sshd-session[234694]: Failed password for root from 80.94.93.119 port 45536 ssh2
Oct 02 08:54:33 compute-0 unix_chkpwd[234697]: password check failed for user (root)
Oct 02 08:54:35 compute-0 nova_compute[192567]: 2025-10-02 08:54:35.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:35 compute-0 podman[234698]: 2025-10-02 08:54:35.148277247 +0000 UTC m=+0.063504736 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:54:35 compute-0 sshd-session[234694]: Failed password for root from 80.94.93.119 port 45536 ssh2
Oct 02 08:54:35 compute-0 unix_chkpwd[234721]: password check failed for user (root)
Oct 02 08:54:36 compute-0 nova_compute[192567]: 2025-10-02 08:54:36.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:37 compute-0 sshd-session[234694]: Failed password for root from 80.94.93.119 port 45536 ssh2
Oct 02 08:54:38 compute-0 sshd-session[234694]: Received disconnect from 80.94.93.119 port 45536:11:  [preauth]
Oct 02 08:54:38 compute-0 sshd-session[234694]: Disconnected from authenticating user root 80.94.93.119 port 45536 [preauth]
Oct 02 08:54:38 compute-0 sshd-session[234694]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 02 08:54:40 compute-0 nova_compute[192567]: 2025-10-02 08:54:40.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:41 compute-0 nova_compute[192567]: 2025-10-02 08:54:41.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:45 compute-0 nova_compute[192567]: 2025-10-02 08:54:45.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:54:46.024 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:54:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:54:46.024 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:54:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:54:46.025 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:54:46 compute-0 podman[234722]: 2025-10-02 08:54:46.163734793 +0000 UTC m=+0.077747559 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., 
url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 02 08:54:46 compute-0 nova_compute[192567]: 2025-10-02 08:54:46.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:50 compute-0 nova_compute[192567]: 2025-10-02 08:54:50.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:51 compute-0 nova_compute[192567]: 2025-10-02 08:54:51.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:55 compute-0 nova_compute[192567]: 2025-10-02 08:54:55.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:56 compute-0 nova_compute[192567]: 2025-10-02 08:54:56.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:54:59 compute-0 podman[203011]: time="2025-10-02T08:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:54:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:54:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 02 08:55:00 compute-0 nova_compute[192567]: 2025-10-02 08:55:00.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:00 compute-0 nova_compute[192567]: 2025-10-02 08:55:00.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:55:00 compute-0 nova_compute[192567]: 2025-10-02 08:55:00.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:55:00 compute-0 nova_compute[192567]: 2025-10-02 08:55:00.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:55:00 compute-0 nova_compute[192567]: 2025-10-02 08:55:00.645 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:55:01 compute-0 podman[234743]: 2025-10-02 08:55:01.185443554 +0000 UTC m=+0.087730051 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 02 08:55:01 compute-0 podman[234745]: 2025-10-02 08:55:01.190732248 +0000 UTC m=+0.090672701 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:55:01 compute-0 podman[234746]: 2025-10-02 08:55:01.208508451 +0000 UTC m=+0.106884656 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Oct 02 08:55:01 compute-0 podman[234744]: 2025-10-02 08:55:01.230533056 +0000 UTC m=+0.128867339 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller)
Oct 02 08:55:01 compute-0 openstack_network_exporter[205118]: ERROR   08:55:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:55:01 compute-0 openstack_network_exporter[205118]: ERROR   08:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:55:01 compute-0 openstack_network_exporter[205118]: ERROR   08:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:55:01 compute-0 openstack_network_exporter[205118]: ERROR   08:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:55:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:55:01 compute-0 openstack_network_exporter[205118]: ERROR   08:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:55:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:55:01 compute-0 nova_compute[192567]: 2025-10-02 08:55:01.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.653 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.654 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.654 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.655 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.865 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.866 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5829MB free_disk=73.46123504638672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.866 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.867 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.931 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.931 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.962 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.976 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.978 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:55:04 compute-0 nova_compute[192567]: 2025-10-02 08:55:04.978 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:55:05 compute-0 nova_compute[192567]: 2025-10-02 08:55:05.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:05 compute-0 nova_compute[192567]: 2025-10-02 08:55:05.978 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:55:06 compute-0 podman[234826]: 2025-10-02 08:55:06.14920049 +0000 UTC m=+0.067718436 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:55:06 compute-0 nova_compute[192567]: 2025-10-02 08:55:06.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:55:06 compute-0 nova_compute[192567]: 2025-10-02 08:55:06.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:08 compute-0 nova_compute[192567]: 2025-10-02 08:55:08.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:55:10 compute-0 nova_compute[192567]: 2025-10-02 08:55:10.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:10 compute-0 nova_compute[192567]: 2025-10-02 08:55:10.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:55:11 compute-0 nova_compute[192567]: 2025-10-02 08:55:11.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:15 compute-0 nova_compute[192567]: 2025-10-02 08:55:15.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:15 compute-0 nova_compute[192567]: 2025-10-02 08:55:15.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:55:16 compute-0 nova_compute[192567]: 2025-10-02 08:55:16.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:17 compute-0 podman[234852]: 2025-10-02 08:55:17.171441288 +0000 UTC m=+0.083617572 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64)
Oct 02 08:55:17 compute-0 nova_compute[192567]: 2025-10-02 08:55:17.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:55:17 compute-0 nova_compute[192567]: 2025-10-02 08:55:17.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:55:20 compute-0 nova_compute[192567]: 2025-10-02 08:55:20.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:21 compute-0 nova_compute[192567]: 2025-10-02 08:55:21.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:25 compute-0 nova_compute[192567]: 2025-10-02 08:55:25.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:26 compute-0 nova_compute[192567]: 2025-10-02 08:55:26.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:29 compute-0 podman[203011]: time="2025-10-02T08:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:55:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:55:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3022 "" "Go-http-client/1.1"
Oct 02 08:55:30 compute-0 nova_compute[192567]: 2025-10-02 08:55:30.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:31 compute-0 openstack_network_exporter[205118]: ERROR   08:55:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:55:31 compute-0 openstack_network_exporter[205118]: ERROR   08:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:55:31 compute-0 openstack_network_exporter[205118]: ERROR   08:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:55:31 compute-0 openstack_network_exporter[205118]: ERROR   08:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:55:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:55:31 compute-0 openstack_network_exporter[205118]: ERROR   08:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:55:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:55:31 compute-0 nova_compute[192567]: 2025-10-02 08:55:31.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:32 compute-0 podman[234874]: 2025-10-02 08:55:32.205612948 +0000 UTC m=+0.091241329 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 02 08:55:32 compute-0 podman[234877]: 2025-10-02 08:55:32.219046995 +0000 UTC m=+0.088712930 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 08:55:32 compute-0 podman[234876]: 2025-10-02 08:55:32.219067556 +0000 UTC m=+0.093463178 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 02 08:55:32 compute-0 podman[234875]: 2025-10-02 08:55:32.250916478 +0000 UTC m=+0.135245659 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct 02 08:55:35 compute-0 nova_compute[192567]: 2025-10-02 08:55:35.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:36 compute-0 nova_compute[192567]: 2025-10-02 08:55:36.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:37 compute-0 podman[234954]: 2025-10-02 08:55:37.213947463 +0000 UTC m=+0.108031431 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 02 08:55:40 compute-0 nova_compute[192567]: 2025-10-02 08:55:40.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:41 compute-0 nova_compute[192567]: 2025-10-02 08:55:41.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:45 compute-0 nova_compute[192567]: 2025-10-02 08:55:45.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:55:46.025 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:55:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:55:46.025 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:55:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:55:46.025 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:55:46 compute-0 nova_compute[192567]: 2025-10-02 08:55:46.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:48 compute-0 podman[234978]: 2025-10-02 08:55:48.143570907 +0000 UTC m=+0.061447592 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm)
Oct 02 08:55:50 compute-0 nova_compute[192567]: 2025-10-02 08:55:50.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:51 compute-0 nova_compute[192567]: 2025-10-02 08:55:51.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:52 compute-0 nova_compute[192567]: 2025-10-02 08:55:52.369 2 DEBUG oslo_concurrency.processutils [None req-6d5d9064-311b-4fb8-b570-219a4f794aaf 06fd0ba32e344f06ac22f27398df6fab a46cbd7217a541c58391886cae342f44 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 02 08:55:52 compute-0 nova_compute[192567]: 2025-10-02 08:55:52.394 2 DEBUG oslo_concurrency.processutils [None req-6d5d9064-311b-4fb8-b570-219a4f794aaf 06fd0ba32e344f06ac22f27398df6fab a46cbd7217a541c58391886cae342f44 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 02 08:55:55 compute-0 nova_compute[192567]: 2025-10-02 08:55:55.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:56 compute-0 nova_compute[192567]: 2025-10-02 08:55:56.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:57 compute-0 nova_compute[192567]: 2025-10-02 08:55:57.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:55:57 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:55:57.177 103703 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'b2:85:aa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '4e:ef:d5:b3:33:42'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 02 08:55:57 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:55:57.178 103703 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 02 08:55:59 compute-0 podman[203011]: time="2025-10-02T08:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:55:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:55:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Oct 02 08:56:00 compute-0 nova_compute[192567]: 2025-10-02 08:56:00.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:00 compute-0 nova_compute[192567]: 2025-10-02 08:56:00.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:56:00 compute-0 nova_compute[192567]: 2025-10-02 08:56:00.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:56:00 compute-0 nova_compute[192567]: 2025-10-02 08:56:00.626 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:56:00 compute-0 nova_compute[192567]: 2025-10-02 08:56:00.649 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:56:01 compute-0 openstack_network_exporter[205118]: ERROR   08:56:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:56:01 compute-0 openstack_network_exporter[205118]: ERROR   08:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:56:01 compute-0 openstack_network_exporter[205118]: ERROR   08:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:56:01 compute-0 openstack_network_exporter[205118]: ERROR   08:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:56:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:56:01 compute-0 openstack_network_exporter[205118]: ERROR   08:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:56:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:56:01 compute-0 nova_compute[192567]: 2025-10-02 08:56:01.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:03 compute-0 podman[235002]: 2025-10-02 08:56:03.14505553 +0000 UTC m=+0.059643546 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 02 08:56:03 compute-0 podman[235000]: 2025-10-02 08:56:03.146126283 +0000 UTC m=+0.066102487 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true)
Oct 02 08:56:03 compute-0 podman[235001]: 2025-10-02 08:56:03.176287311 +0000 UTC m=+0.091982092 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_id=ovn_controller)
Oct 02 08:56:03 compute-0 podman[235003]: 2025-10-02 08:56:03.176312272 +0000 UTC m=+0.084430177 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.663 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.663 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.663 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.663 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.833 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.834 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5835MB free_disk=73.46237564086914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.834 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.835 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.917 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.918 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:56:04 compute-0 nova_compute[192567]: 2025-10-02 08:56:04.935 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing inventories for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 02 08:56:05 compute-0 nova_compute[192567]: 2025-10-02 08:56:05.050 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating ProviderTree inventory for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 02 08:56:05 compute-0 nova_compute[192567]: 2025-10-02 08:56:05.051 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Updating inventory in ProviderTree for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 02 08:56:05 compute-0 nova_compute[192567]: 2025-10-02 08:56:05.079 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing aggregate associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 02 08:56:05 compute-0 nova_compute[192567]: 2025-10-02 08:56:05.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:05 compute-0 nova_compute[192567]: 2025-10-02 08:56:05.290 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Refreshing trait associations for resource provider e7f6698e-de2d-4705-8493-a3445ce0cf6e, traits: COMPUTE_STORAGE_BUS_FDC,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 02 08:56:05 compute-0 nova_compute[192567]: 2025-10-02 08:56:05.314 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:56:05 compute-0 nova_compute[192567]: 2025-10-02 08:56:05.330 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:56:05 compute-0 nova_compute[192567]: 2025-10-02 08:56:05.334 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:56:05 compute-0 nova_compute[192567]: 2025-10-02 08:56:05.334 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.499s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:56:06 compute-0 nova_compute[192567]: 2025-10-02 08:56:06.335 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:56:06 compute-0 nova_compute[192567]: 2025-10-02 08:56:06.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:56:06 compute-0 nova_compute[192567]: 2025-10-02 08:56:06.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:07 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:56:07.182 103703 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=66c4bca3-98aa-4361-8801-8722dd9a7888, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 02 08:56:08 compute-0 podman[235078]: 2025-10-02 08:56:08.181759957 +0000 UTC m=+0.087053879 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:56:08 compute-0 nova_compute[192567]: 2025-10-02 08:56:08.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:56:09 compute-0 nova_compute[192567]: 2025-10-02 08:56:09.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:56:10 compute-0 nova_compute[192567]: 2025-10-02 08:56:10.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:11 compute-0 nova_compute[192567]: 2025-10-02 08:56:11.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:12 compute-0 nova_compute[192567]: 2025-10-02 08:56:12.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:56:15 compute-0 nova_compute[192567]: 2025-10-02 08:56:15.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:15 compute-0 nova_compute[192567]: 2025-10-02 08:56:15.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:56:17 compute-0 nova_compute[192567]: 2025-10-02 08:56:17.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:17 compute-0 nova_compute[192567]: 2025-10-02 08:56:17.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:56:18 compute-0 nova_compute[192567]: 2025-10-02 08:56:18.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:56:18 compute-0 nova_compute[192567]: 2025-10-02 08:56:18.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:56:19 compute-0 podman[235102]: 2025-10-02 08:56:19.206445171 +0000 UTC m=+0.104728729 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Oct 02 08:56:20 compute-0 nova_compute[192567]: 2025-10-02 08:56:20.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:22 compute-0 nova_compute[192567]: 2025-10-02 08:56:22.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:25 compute-0 nova_compute[192567]: 2025-10-02 08:56:25.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:27 compute-0 nova_compute[192567]: 2025-10-02 08:56:27.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:29 compute-0 podman[203011]: time="2025-10-02T08:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:56:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:56:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Oct 02 08:56:30 compute-0 nova_compute[192567]: 2025-10-02 08:56:30.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:31 compute-0 openstack_network_exporter[205118]: ERROR   08:56:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:56:31 compute-0 openstack_network_exporter[205118]: ERROR   08:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:56:31 compute-0 openstack_network_exporter[205118]: ERROR   08:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:56:31 compute-0 openstack_network_exporter[205118]: ERROR   08:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:56:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:56:31 compute-0 openstack_network_exporter[205118]: ERROR   08:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:56:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:56:32 compute-0 nova_compute[192567]: 2025-10-02 08:56:32.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:34 compute-0 podman[235127]: 2025-10-02 08:56:34.186232259 +0000 UTC m=+0.079329069 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:56:34 compute-0 podman[235125]: 2025-10-02 08:56:34.191669148 +0000 UTC m=+0.107425542 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:56:34 compute-0 podman[235126]: 2025-10-02 08:56:34.216915483 +0000 UTC m=+0.117740083 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Oct 02 08:56:34 compute-0 podman[235133]: 2025-10-02 08:56:34.219744521 +0000 UTC m=+0.109346082 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=iscsid)
Oct 02 08:56:35 compute-0 nova_compute[192567]: 2025-10-02 08:56:35.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:37 compute-0 nova_compute[192567]: 2025-10-02 08:56:37.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:39 compute-0 podman[235208]: 2025-10-02 08:56:39.173754136 +0000 UTC m=+0.081903109 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:56:40 compute-0 nova_compute[192567]: 2025-10-02 08:56:40.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:42 compute-0 nova_compute[192567]: 2025-10-02 08:56:42.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:45 compute-0 nova_compute[192567]: 2025-10-02 08:56:45.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:56:46.026 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:56:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:56:46.027 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:56:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:56:46.027 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:56:47 compute-0 nova_compute[192567]: 2025-10-02 08:56:47.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:50 compute-0 podman[235232]: 2025-10-02 08:56:50.157031392 +0000 UTC m=+0.072404053 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Oct 02 08:56:50 compute-0 nova_compute[192567]: 2025-10-02 08:56:50.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:52 compute-0 nova_compute[192567]: 2025-10-02 08:56:52.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:55 compute-0 nova_compute[192567]: 2025-10-02 08:56:55.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:57 compute-0 nova_compute[192567]: 2025-10-02 08:56:57.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:56:59 compute-0 podman[203011]: time="2025-10-02T08:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:56:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:56:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 02 08:57:00 compute-0 nova_compute[192567]: 2025-10-02 08:57:00.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:00 compute-0 nova_compute[192567]: 2025-10-02 08:57:00.626 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:00 compute-0 nova_compute[192567]: 2025-10-02 08:57:00.627 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:57:00 compute-0 nova_compute[192567]: 2025-10-02 08:57:00.627 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:57:00 compute-0 nova_compute[192567]: 2025-10-02 08:57:00.645 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:57:01 compute-0 openstack_network_exporter[205118]: ERROR   08:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:57:01 compute-0 openstack_network_exporter[205118]: ERROR   08:57:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:57:01 compute-0 openstack_network_exporter[205118]: ERROR   08:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:57:01 compute-0 openstack_network_exporter[205118]: ERROR   08:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:57:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:57:01 compute-0 openstack_network_exporter[205118]: ERROR   08:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:57:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:57:02 compute-0 nova_compute[192567]: 2025-10-02 08:57:02.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.651 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.652 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.652 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.653 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.879 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.881 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5852MB free_disk=73.46213150024414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.881 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.882 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.964 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:57:04 compute-0 nova_compute[192567]: 2025-10-02 08:57:04.965 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:57:05 compute-0 nova_compute[192567]: 2025-10-02 08:57:05.037 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:57:05 compute-0 nova_compute[192567]: 2025-10-02 08:57:05.061 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:57:05 compute-0 nova_compute[192567]: 2025-10-02 08:57:05.064 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:57:05 compute-0 nova_compute[192567]: 2025-10-02 08:57:05.065 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:57:05 compute-0 podman[235257]: 2025-10-02 08:57:05.184192821 +0000 UTC m=+0.067345176 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 02 08:57:05 compute-0 podman[235255]: 2025-10-02 08:57:05.184249543 +0000 UTC m=+0.072980161 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a)
Oct 02 08:57:05 compute-0 podman[235258]: 2025-10-02 08:57:05.210263602 +0000 UTC m=+0.079075920 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 02 08:57:05 compute-0 podman[235256]: 2025-10-02 08:57:05.237823209 +0000 UTC m=+0.126988751 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct 02 08:57:05 compute-0 nova_compute[192567]: 2025-10-02 08:57:05.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:06 compute-0 nova_compute[192567]: 2025-10-02 08:57:06.065 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:06 compute-0 nova_compute[192567]: 2025-10-02 08:57:06.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:07 compute-0 nova_compute[192567]: 2025-10-02 08:57:07.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:09 compute-0 nova_compute[192567]: 2025-10-02 08:57:09.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:10 compute-0 podman[235334]: 2025-10-02 08:57:10.169957973 +0000 UTC m=+0.075709226 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 02 08:57:10 compute-0 nova_compute[192567]: 2025-10-02 08:57:10.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:11 compute-0 nova_compute[192567]: 2025-10-02 08:57:11.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:12 compute-0 nova_compute[192567]: 2025-10-02 08:57:12.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:12 compute-0 nova_compute[192567]: 2025-10-02 08:57:12.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:15 compute-0 nova_compute[192567]: 2025-10-02 08:57:15.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:15 compute-0 nova_compute[192567]: 2025-10-02 08:57:15.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:16 compute-0 nova_compute[192567]: 2025-10-02 08:57:16.624 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:16 compute-0 nova_compute[192567]: 2025-10-02 08:57:16.625 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 02 08:57:16 compute-0 nova_compute[192567]: 2025-10-02 08:57:16.641 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 02 08:57:17 compute-0 nova_compute[192567]: 2025-10-02 08:57:17.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:20 compute-0 nova_compute[192567]: 2025-10-02 08:57:20.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:20 compute-0 nova_compute[192567]: 2025-10-02 08:57:20.642 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:20 compute-0 nova_compute[192567]: 2025-10-02 08:57:20.642 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:57:21 compute-0 podman[235358]: 2025-10-02 08:57:21.196926587 +0000 UTC m=+0.101272321 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 02 08:57:22 compute-0 nova_compute[192567]: 2025-10-02 08:57:22.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:25 compute-0 nova_compute[192567]: 2025-10-02 08:57:25.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:26 compute-0 nova_compute[192567]: 2025-10-02 08:57:26.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:26 compute-0 nova_compute[192567]: 2025-10-02 08:57:26.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 02 08:57:27 compute-0 nova_compute[192567]: 2025-10-02 08:57:27.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:29 compute-0 podman[203011]: time="2025-10-02T08:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:57:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:57:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Oct 02 08:57:30 compute-0 nova_compute[192567]: 2025-10-02 08:57:30.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:31 compute-0 openstack_network_exporter[205118]: ERROR   08:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:57:31 compute-0 openstack_network_exporter[205118]: ERROR   08:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:57:31 compute-0 openstack_network_exporter[205118]: ERROR   08:57:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:57:31 compute-0 openstack_network_exporter[205118]: ERROR   08:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:57:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:57:31 compute-0 openstack_network_exporter[205118]: ERROR   08:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:57:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:57:32 compute-0 nova_compute[192567]: 2025-10-02 08:57:32.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:35 compute-0 nova_compute[192567]: 2025-10-02 08:57:35.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:36 compute-0 podman[235381]: 2025-10-02 08:57:36.186172597 +0000 UTC m=+0.090311260 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:57:36 compute-0 podman[235384]: 2025-10-02 08:57:36.195028672 +0000 UTC m=+0.085072337 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=iscsid, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 02 08:57:36 compute-0 podman[235383]: 2025-10-02 08:57:36.197549571 +0000 UTC m=+0.105660807 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:57:36 compute-0 podman[235382]: 2025-10-02 08:57:36.227384949 +0000 UTC m=+0.128371494 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 02 08:57:37 compute-0 nova_compute[192567]: 2025-10-02 08:57:37.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:38 compute-0 nova_compute[192567]: 2025-10-02 08:57:38.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:57:40 compute-0 nova_compute[192567]: 2025-10-02 08:57:40.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:41 compute-0 podman[235463]: 2025-10-02 08:57:41.152094343 +0000 UTC m=+0.070751022 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 02 08:57:42 compute-0 nova_compute[192567]: 2025-10-02 08:57:42.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:45 compute-0 nova_compute[192567]: 2025-10-02 08:57:45.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:57:46.028 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:57:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:57:46.029 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:57:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:57:46.029 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:57:47 compute-0 nova_compute[192567]: 2025-10-02 08:57:47.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:50 compute-0 nova_compute[192567]: 2025-10-02 08:57:50.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:52 compute-0 podman[235488]: 2025-10-02 08:57:52.199402099 +0000 UTC m=+0.105914536 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.6)
Oct 02 08:57:52 compute-0 nova_compute[192567]: 2025-10-02 08:57:52.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:55 compute-0 nova_compute[192567]: 2025-10-02 08:57:55.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:57 compute-0 nova_compute[192567]: 2025-10-02 08:57:57.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:57:59 compute-0 podman[203011]: time="2025-10-02T08:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:57:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:57:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3022 "" "Go-http-client/1.1"
Oct 02 08:58:00 compute-0 nova_compute[192567]: 2025-10-02 08:58:00.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:01 compute-0 openstack_network_exporter[205118]: ERROR   08:58:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:58:01 compute-0 openstack_network_exporter[205118]: ERROR   08:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:58:01 compute-0 openstack_network_exporter[205118]: ERROR   08:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:58:01 compute-0 openstack_network_exporter[205118]: ERROR   08:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:58:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:58:01 compute-0 openstack_network_exporter[205118]: ERROR   08:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:58:01 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:58:01 compute-0 nova_compute[192567]: 2025-10-02 08:58:01.652 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:58:01 compute-0 nova_compute[192567]: 2025-10-02 08:58:01.653 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 02 08:58:01 compute-0 nova_compute[192567]: 2025-10-02 08:58:01.654 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 02 08:58:01 compute-0 nova_compute[192567]: 2025-10-02 08:58:01.693 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 02 08:58:02 compute-0 nova_compute[192567]: 2025-10-02 08:58:02.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:05 compute-0 nova_compute[192567]: 2025-10-02 08:58:05.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:05 compute-0 nova_compute[192567]: 2025-10-02 08:58:05.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.650 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.652 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.652 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.653 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.862 2 WARNING nova.virt.libvirt.driver [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.865 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5851MB free_disk=73.46215057373047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.866 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.866 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.931 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.932 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.958 2 DEBUG nova.compute.provider_tree [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed in ProviderTree for provider: e7f6698e-de2d-4705-8493-a3445ce0cf6e update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.976 2 DEBUG nova.scheduler.client.report [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Inventory has not changed for provider e7f6698e-de2d-4705-8493-a3445ce0cf6e based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.979 2 DEBUG nova.compute.resource_tracker [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 02 08:58:06 compute-0 nova_compute[192567]: 2025-10-02 08:58:06.979 2 DEBUG oslo_concurrency.lockutils [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:58:07 compute-0 podman[235509]: 2025-10-02 08:58:07.185850042 +0000 UTC m=+0.090870077 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 02 08:58:07 compute-0 podman[235511]: 2025-10-02 08:58:07.202253142 +0000 UTC m=+0.092256240 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 02 08:58:07 compute-0 podman[235512]: 2025-10-02 08:58:07.21761897 +0000 UTC m=+0.111689235 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 02 08:58:07 compute-0 nova_compute[192567]: 2025-10-02 08:58:07.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:07 compute-0 podman[235510]: 2025-10-02 08:58:07.224484633 +0000 UTC m=+0.129674174 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, tcib_managed=true)
Oct 02 08:58:10 compute-0 nova_compute[192567]: 2025-10-02 08:58:10.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:11 compute-0 nova_compute[192567]: 2025-10-02 08:58:11.981 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:58:12 compute-0 podman[235589]: 2025-10-02 08:58:12.189592903 +0000 UTC m=+0.093612572 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:58:12 compute-0 nova_compute[192567]: 2025-10-02 08:58:12.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:12 compute-0 nova_compute[192567]: 2025-10-02 08:58:12.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:58:13 compute-0 nova_compute[192567]: 2025-10-02 08:58:13.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:58:15 compute-0 nova_compute[192567]: 2025-10-02 08:58:15.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:17 compute-0 nova_compute[192567]: 2025-10-02 08:58:17.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:17 compute-0 nova_compute[192567]: 2025-10-02 08:58:17.625 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:58:20 compute-0 nova_compute[192567]: 2025-10-02 08:58:20.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:21 compute-0 nova_compute[192567]: 2025-10-02 08:58:21.620 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:58:22 compute-0 nova_compute[192567]: 2025-10-02 08:58:22.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:22 compute-0 nova_compute[192567]: 2025-10-02 08:58:22.623 2 DEBUG oslo_service.periodic_task [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 02 08:58:22 compute-0 nova_compute[192567]: 2025-10-02 08:58:22.624 2 DEBUG nova.compute.manager [None req-f440359b-9ca0-4f03-a336-57f13db3b1f9 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 02 08:58:23 compute-0 podman[235611]: 2025-10-02 08:58:23.162656101 +0000 UTC m=+0.079731402 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Oct 02 08:58:25 compute-0 nova_compute[192567]: 2025-10-02 08:58:25.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:27 compute-0 nova_compute[192567]: 2025-10-02 08:58:27.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:29 compute-0 podman[203011]: time="2025-10-02T08:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:58:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:58:29 compute-0 podman[203011]: @ - - [02/Oct/2025:08:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Oct 02 08:58:30 compute-0 nova_compute[192567]: 2025-10-02 08:58:30.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:31 compute-0 openstack_network_exporter[205118]: ERROR   08:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:58:31 compute-0 openstack_network_exporter[205118]: ERROR   08:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 02 08:58:31 compute-0 openstack_network_exporter[205118]: ERROR   08:58:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 02 08:58:31 compute-0 openstack_network_exporter[205118]: ERROR   08:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 02 08:58:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:58:31 compute-0 openstack_network_exporter[205118]: ERROR   08:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 02 08:58:31 compute-0 openstack_network_exporter[205118]: 
Oct 02 08:58:32 compute-0 nova_compute[192567]: 2025-10-02 08:58:32.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:35 compute-0 nova_compute[192567]: 2025-10-02 08:58:35.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:37 compute-0 nova_compute[192567]: 2025-10-02 08:58:37.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:38 compute-0 podman[235632]: 2025-10-02 08:58:38.1835956 +0000 UTC m=+0.088248117 container health_status 66d98910ff67c8d48af5fd821f9d11267fd669aeb605f0f4a1b89b6bd4e527fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 02 08:58:38 compute-0 podman[235634]: 2025-10-02 08:58:38.19809796 +0000 UTC m=+0.091077394 container health_status bc6a54c981d1e2865e2a6b4bf6909c94ee26f73b5851a49fbe1c80adadc67c0b (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 02 08:58:38 compute-0 podman[235633]: 2025-10-02 08:58:38.225514343 +0000 UTC m=+0.120390265 container health_status 7fc8617e6673a6865a99ab13e238e76dc0ef2a2a4cd1cbac7da3710a6a8d63a6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 02 08:58:38 compute-0 podman[235635]: 2025-10-02 08:58:38.226272517 +0000 UTC m=+0.112446889 container health_status c3c18de1cc56af04fc308fcbd4df74ec8d6a2d517a53b4b1d254512294a74fc0 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=a0eac564d779a7eaac46c9816bff261a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct 02 08:58:40 compute-0 nova_compute[192567]: 2025-10-02 08:58:40.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:42 compute-0 nova_compute[192567]: 2025-10-02 08:58:42.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:43 compute-0 podman[235714]: 2025-10-02 08:58:43.172494849 +0000 UTC m=+0.080742803 container health_status 922b9d003940fda5af15c8780df7bef5c4e73591125bcfe462360dcb4b44f3f3 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 02 08:58:44 compute-0 sshd-session[235738]: Invalid user hms from 193.32.162.151 port 41970
Oct 02 08:58:44 compute-0 sshd-session[235738]: pam_unix(sshd:auth): check pass; user unknown
Oct 02 08:58:44 compute-0 sshd-session[235738]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.32.162.151
Oct 02 08:58:45 compute-0 nova_compute[192567]: 2025-10-02 08:58:45.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:58:46.030 103703 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 02 08:58:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:58:46.031 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 02 08:58:46 compute-0 ovn_metadata_agent[103698]: 2025-10-02 08:58:46.031 103703 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 02 08:58:46 compute-0 sshd-session[235738]: Failed password for invalid user hms from 193.32.162.151 port 41970 ssh2
Oct 02 08:58:47 compute-0 sshd-session[235740]: Accepted publickey for zuul from 192.168.122.10 port 38932 ssh2: ECDSA SHA256:6/ItOgjcxtX5190Tph2f93zR90/w8uxqrUSxh6/0UQY
Oct 02 08:58:47 compute-0 systemd-logind[827]: New session 46 of user zuul.
Oct 02 08:58:47 compute-0 systemd[1]: Started Session 46 of User zuul.
Oct 02 08:58:47 compute-0 sshd-session[235740]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 02 08:58:47 compute-0 nova_compute[192567]: 2025-10-02 08:58:47.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:47 compute-0 sudo[235744]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 02 08:58:47 compute-0 sudo[235744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 02 08:58:47 compute-0 sshd-session[235738]: Connection closed by invalid user hms 193.32.162.151 port 41970 [preauth]
Oct 02 08:58:50 compute-0 nova_compute[192567]: 2025-10-02 08:58:50.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:51 compute-0 ovs-vsctl[235917]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 02 08:58:52 compute-0 nova_compute[192567]: 2025-10-02 08:58:52.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:52 compute-0 virtqemud[192112]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 02 08:58:52 compute-0 virtqemud[192112]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 02 08:58:52 compute-0 virtqemud[192112]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 02 08:58:53 compute-0 podman[236128]: 2025-10-02 08:58:53.344947154 +0000 UTC m=+0.090803396 container health_status a8c7d6a9f10fd497877aa82139fa3c080ee3548c4523b21706bcde82e14791f9 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Oct 02 08:58:54 compute-0 crontab[236361]: (root) LIST (root)
Oct 02 08:58:55 compute-0 nova_compute[192567]: 2025-10-02 08:58:55.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:56 compute-0 systemd[1]: Starting Hostname Service...
Oct 02 08:58:56 compute-0 systemd[1]: Started Hostname Service.
Oct 02 08:58:57 compute-0 nova_compute[192567]: 2025-10-02 08:58:57.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 02 08:58:59 compute-0 podman[203011]: time="2025-10-02T08:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 02 08:58:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19547 "" "Go-http-client/1.1"
Oct 02 08:58:59 compute-0 podman[203011]: @ - - [02/Oct/2025:08:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3022 "" "Go-http-client/1.1"
Oct 02 08:59:00 compute-0 nova_compute[192567]: 2025-10-02 08:59:00.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
